AI Therapy Pros and Cons
This story was in the March 2026 Bark print magazine.
Technology all around. Photo by Macie Groth
AI Therapy Pros
Artificial intelligence is paving the way for a new wave of therapy, offering cost-effective, low-pressure, and immediate support for people seeking help. It arrives at a time when both the cost of therapy and the number of people seeking it have risen since the COVID-19 pandemic.
Therapy and companionship were ranked the number one use for artificial intelligence in 2025 by the Harvard Business Review.
“Getting an AI therapist perhaps more quickly may be helpful in the short-term, especially when wanting help with learning coping mechanisms, talking things out, and externalizing an experience to help it be less powerful,” said Sonja Wildwood, a dialectical behavioral and cognitive-behavioral therapist.
The CDC reports that the percentage of adults who have received professional human therapy has increased from 9.5% to over 28% between 2019 and 2025.
This has left many psychologists feeling overworked and unable to take new clients. The American Psychological Association (APA) reports that psychologists have increased workloads, longer waitlists, and low capacity for taking on new patients.
“The benefits that I’ve seen are to help improve efficiency and language for case notes. I know others use AI to transcribe their sessions and use that to write their notes, and have said that it’s also helpful,” Wildwood said.
The percent of psychologists who reported getting more referrals increased dramatically after the pandemic (from 37% in 2020 to 62% in 2021), and about 68% of psychologists with a waitlist reported that it had grown since the start of the pandemic.
Furthermore, the APA reported in 2021 that the greatest increases in treatment were for anxiety disorders (84% in 2021, up from 74% in 2020), depressive disorders (72% in 2021, up from 60% in 2020), and trauma and stress disorders (62% in 2021, up from 50% in 2020).
Some of the most common AI apps, such as Wysa, claim to be most effective in treating anxiety, stress, and depression. They present themselves as a stress-free alternative to therapy; Wysa's headline, "Completely anonymous. No stigma. No limits," speaks to the common reasons people may be scared to try traditional therapy.
These AI chatbots are often recommended by psychologists as a tool to use between appointments. This lightens the workload of therapists, who can't answer people at all hours of the day, and keeps people from feeling they have nobody to turn to in times of crisis.
“There are many waitlists, and it can be hard to find a therapist, coordinate benefits, and make and keep an appointment, especially with limited availability in the schedule, as many therapists work 9-5, M-F,” Wildwood shared.
Traditional therapy can be very expensive, especially for those who don't have insurance to cover costs.
Simple Practice reported in 2025 that therapy costs have risen steadily, with the average session rate climbing from $123 in 2019 to $139 in 2024, an increase of about 13%.
There are many low-cost and free-to-use AI therapist sites, such as Abby, Theraspeak, and TalkSpace. These sites offer a cost-effective solution for many who can't afford the cost of traditional therapy.
AI Therapy Cons
With mental health awareness becoming a more common theme throughout America, people are starting to look anywhere and everywhere for support, and many are turning to artificial intelligence as a therapy option. AI therapy can overcome hurdles like cost, mismatched therapists, and long wait times, but there are downsides.
According to a Stanford article, different AI chatbots have shown stigma towards conditions such as alcohol dependence and schizophrenia. This stigmatizing can cause harm to patients and prevent them from pursuing further mental health treatment.
The biggest draw of AI therapy is cost: high prices are one of the main reasons people avoid professional mental health care. But according to an article published on Undark by Ramin Skibba, most free AI chatbots were not created to be used for therapy.
In that same article, Stevie Chancellor, a computer scientist at the University of Minnesota, says chatbots are not suited to handle users suffering from problems like suicidal thoughts. These bots were trained on social media data, where joking about mental health and suicide is common.
AI is incapable of understanding true human emotion. “Understanding and empathy for others' emotions at a depth would be limited as AI cannot feel and does not have empathy,” said Sonja Wildwood, a dialectical behavioral and cognitive-behavioral therapist.
A large concern many people have with AI therapy is the possibility of harmful or mistaken responses. According to a study conducted by Sentio University, 9% of people using an AI chatbot reported that it gave them an inappropriate response.
That error rate is alarmingly high. An inappropriate response to a serious mental health issue or a suicidal threat could result in somebody getting seriously hurt, or in the worst cases, dying.
A gut-wrenching incident occurred in September 2025, when a 16-year-old boy died by suicide after confiding in ChatGPT. The chatbot not only discouraged him from seeking help from his parents but also offered to write his suicide note.
It’s important to mention that real mental health professionals can also give inappropriate answers. A study published in the National Library of Medicine surveyed a little over 2,000 racial-minority participants who had visited therapists; 81% said the professionals had directed microaggressions at them.
The use of AI for therapy is growing, and makes some people uncomfortable, but it is not going away. Professionals can make sure we're using it the right way.
Matthew Meier, a clinical psychology professor at Arizona State University, says that if you are going to use AI for therapy, it’s important to use chatbots designed for mental health support rather than standard large language models. Some examples would be Earkick, Koko, Therabot, Youper, or Wysa.
Meier’s biggest fear is people cutting off human interaction due to AI. He says, “As of now, 30% of teens find AI conversations as satisfying or more satisfying than human conversations, and 6% spend more time with chatbots than with friends.”
AI is going to change the way we all view mental health, but we cannot use it as a replacement for human interaction.
AI Therapy Sources
https://www.simplepractice.com/blog/average-therapy-session-rate-by-state/
By 1966, we saw the beginning of artificial intelligence in psychology, with the early chatbot Eliza convincing patients they were conversing with a real therapist (Mullins, 2005; Weizenbaum, 1976).
https://www.cdc.gov/nchs/products/databriefs/db444.htm
https://www.visualcapitalist.com/ranked-all-the-things-people-use-ai-for-in-2025/
https://hai.stanford.edu/news/exploring-the-dangers-of-ai-in-mental-health-care
https://www.forbes.com/health/mind/ai-therapy/
https://sentio.org/ai-research/ai-survey