Should AI Be Used in Mental Health Therapy?


Artificial intelligence is everywhere right now.

It’s showing up in our phones, our work tools, our creative processes—and increasingly, in healthcare spaces, including mental health.

Note: For this blog, we’re just talking about outpatient therapy practices. The topic of AI as a whole in mental health is a big one, and it’s really complex.

Some therapists are beginning to use AI-powered tools during or around therapy sessions. These include note-taking software, session “assistants,” treatment-planning tools, or platforms that analyze session content.

We want to be very clear about our stance.

At ERA, we do not use AI in therapy sessions, and we do not (and will not) use AI tools that process any client-related information, including for documentation purposes.

And honestly? It feels a little strange that we even have to say that at all.


Why We’ve Made This Decision

We could probably save time by using AI tools. We could streamline documentation. We could “optimize” our workflows in ways that are increasingly common in healthcare. Maybe we’d even sound smarter.

We’ve chosen not to. Our decision centers on one core value: client privacy matters more than convenience.

Therapy is built on trust, human connection, and being seen as you are (this is also part of why we aren’t big fans of using platforms like ChatGPT as a therapist, in addition to some pretty big ethical concerns, but that’s a blog for another day!). It’s a relational process that develops over time. When someone comes to therapy, they are sharing some of the most vulnerable parts of their inner world. That space is sacred, and it deserves the utmost care and protection.

AI tools, by design, often rely on data storage, processing, learning, and third-party access.

Even when companies claim strong security or “HIPAA-compliant” frameworks, the reality is that data pathways can be complex and difficult to fully verify. Once information leaves the room, control over it often becomes less clear.

For us, that risk alone is enough to pause.

We believe that therapy should be a space where your information is truly private and protected. What you tell us is just between us. You don’t have to wonder where that information is going, how it’s being used, or what might be learning from it.

Your story isn’t something to automate or streamline or optimize. This isn’t about being anti-technology. It’s about being deeply pro-client, which we are and will always be.

Our Commitment to You

At ERA, we are committed to being transparent about the tools we use and prioritizing our clinic values, which include intention, ethics, and people first.

We’ll always tell you what we’re doing and why we’re doing it.

If you ever have questions about our practices, we welcome them. Your trust matters to us.


If you’re seeing a therapist at another practice and aren’t sure whether or not AI is being used, read on for some tips about your rights as a client and how to ask about AI tools being used in your therapy spaces.

Your Rights as a Client

There are laws around how certain healthcare information is stored and transmitted. Most people have heard of HIPAA - the Health Insurance Portability and Accountability Act. This 1996 law regulates how your protected health information (PHI) is managed, transmitted, and stored. It dictates how healthcare providers, including therapists, treat your information and what we can do with it.

It also dictates that we cannot do things with your PHI without your consent. If your therapist is using AI in their practice, you have the right to transparency. You have the right to know whether AI is being used in or around your sessions, to know where your information is stored and who has access to it, and to give informed consent or decline without penalty.

If these things haven’t been clearly explained, it’s okay to bring this up and ask.



Questions You Are Allowed to Ask

If you’re unsure about how technology is being used in your therapy sessions or documentation, here are some reasonable and appropriate questions:

  • “Are you using AI tools during or related to my sessions or documentation in my chart?”

  • “Does any software record, transcribe, or analyze what I say in our sessions?”

  • “Do you use any AI software to aid in writing clinical documentation related to our sessions?”

  • “Is any of my information used or stored outside of your practice systems?”

  • “Can I opt out of AI-related tools without it affecting my care?”

  • “Can you please stop and explain that more? I didn’t quite understand.”

Your provider should be able to answer these questions clearly and directly. You’re allowed to ask follow-up or clarifying questions, and to request to be notified if anything changes.

Caring About This Is Not Overreacting - But It’s Also OK if You’re OK With It

Your preferences for your therapy sessions matter. Wanting your therapy to be fully human, private, and free from AI is okay. It’s also okay if you feel comfortable with your therapist using AI tools in your sessions or for documentation.

There are some valid reasons that some clinicians choose to use AI. It often saves time, and it can be very helpful for clinicians with disabilities, or who are neurodivergent, in completing their documentation in a timely manner.

The important thing is consent - you deserve to know how your information is being collected and used, and to have autonomy over it. It’s important that your clinician (or other healthcare provider) takes your privacy seriously and does everything they can to safeguard your information.



Not a client at ERA yet and want to be?

Schedule a new client consult to learn more about how we can help.
