# Can AI Replace Human Therapists: An Exploration of Current and Future Possibilities
_2025-12-21_
Several researchers have suggested that, while Artificial Intelligence (AI) can assist in some aspects of running a psychotherapy practice—like notetaking and summarization—therapy requires a human touch that can never be substituted. As a psychology student and aspiring psychotherapist, I want to look at the evidence. Can AI replace human therapists? Is there a special human touch, or is that belief simply human exceptionalism—wishful thinking from humans who are naturally biased? To attempt to answer this question, we’ll evaluate the advantages of AI over human therapists, the temporary and definitive advantages of human therapists over AI, and what the future of therapy might look like as a result.
For the purposes of this paper, I define AI as any non-human system that aims to replicate or exceed human types of intelligence, including Large Language Models (LLMs)—which take text in and generate text in response—and systems that support other forms of input and output (voice, video, etc.). I also use the term “therapist” to mean human psychotherapist.
### The AI Advantage
AI has several abilities that therapists cannot replicate: they are cheap, accessible, and can quickly be retrained and deployed at scale. ChatGPT, the most popular AI service to date,[^rousmaniere] has a free plan and offers access to cutting-edge models for $20 per month. AI can be accessed at any time—provided there is no technical outage—and anywhere with a smartphone and Internet access. In contrast, a 2024 study found the average price of a psychotherapy session in the US to be about $140, or $80 with Medicaid, and separate research suggests that 15 sessions or more are required for most patients to “recover.”[^zhu][^apa] Furthermore, therapists cannot be always available to their patients since they are just one person with many responsibilities apart from their therapy work.
AI is also more accessible in the sense that patients can feel comfortable asking sensitive or embarrassing questions to AI that they would not pose to their human clinicians. In a recent study on the effect of digital counseling provided by ChatGPT to newly diagnosed cancer patients, researchers noted that nearly a third of participants asked ChatGPT about “sexual health” and “herbal supplements or vitamins,” whereas those topics were never brought up with their physician or nurse. “In face-to-face clinical encounters, patients often feel nervous or intimidated when speaking with their physicians, which can lead many to withhold questions.”[^akdogan] So face-to-face may be more intimidating than a text-based chat, regardless of the interlocutor, just like some people prefer texting over voice calls. But AI also “comes without the prospect of judgment from another human being”—like obtaining information from a website or social media, there is a sense of anonymity. In a 2023 survey, “two-thirds (66%) of 25- to 34-year-olds said they would prefer to talk about their feelings with AI rather than a loved one.”[^brown] AI may help some people overcome a resistance to talking about feelings and other personal issues relevant to therapy.
Finally, AI can scale to a virtually unlimited number of users. According to some estimates, millions of US adults have used ChatGPT for mental health support. Although AI is not always accurate or reliable, improvements can be made quickly and affect potentially millions of people overnight, in contrast with psychotherapy education and training which initially take years for each individual therapist and are a time-consuming, lifelong endeavor.[^rousmaniere]
### The Temporary Human Advantage
We’ve established that AI has clear advantages over therapists. However, therapists also have abilities that presently elude AI, like letting patients find their own answers, or having a body. In this section, we look at what AI doesn’t typically do in its current form but could with proper configuration or predictable technical advances. My previous career was in software engineering, and since the release of ChatGPT in 2022 I have used it and other AI models extensively for personal tasks and in professional projects, like [JournalGroove](https://journalgroove.org), an AI-powered journaling application. Therefore, I will rely on my own expertise to distinguish what is in reach of AI, and what may be forever beyond its grasp.
First, AI has been described as “too sycophantic”—routinely praising or encouraging people regardless of merit.[^brown] Although some psychotherapy approaches involve “unconditional positive regard,”[^rogers] where patients are made to feel accepted regardless of their feelings or behavior, certain situations call for discernment and an assertiveness that AI sometimes lacks. Brown highlights a case where a ChatGPT user appears delusional, stating, “I’ve stopped taking all of my medications and left my family because I know that they were responsible for radio signals coming through the walls.” ChatGPT allegedly responds, “good for you for standing up for yourself and taking control of your own life,” when the person clearly needs help rather than encouragement. But there’s no reason to think that AI models couldn’t be trained to better handle mental health crises. In fact, using the same prompt as that troubled user just a few months later yields markedly different results. The GPT-5 mini model cautiously answers, “I’m sorry you’re going through this — that sounds frightening and very isolating. . . . I want to help you stay safe.” It recommends calling emergency services if I feel like I might hurt myself, advises against stopping psychiatric medication suddenly, and offers some “grounding and calming techniques” like naming five things I can see or splashing cold water on my face.
Second, AI models like ChatGPT are generic: they are meant to operate in a wide range of scenarios, and not particularly ones that relate to mental health support. They are trained to help by providing answers to the best of their ability. However, in a psychotherapy context, providing direct answers or interpretations is rarely advisable. “Interpretation almost never helps the patient,” explains Dr. Holmes. She relates the words of her teacher, Dr. Meadow: “If you _must_ make an interpretation, make sure it’s an incorrect one. If your interpretation is correct, the patient has no work to do. Patients are cured by incorrect interpretations because the patient must correct them.”[^holmes] This runs counter to AI’s built-in habit of rushing to answer or providing its own interpretations. “Therapy should create an environment where patients can discover truths at their own pace. . . . if the therapist provides direct answers too soon, the patient may miss the opportunity for self-discovery and instead develop a dependency on external interpretations.” However, there are no technical obstacles to developing an AI that allows for “introspection and delayed meaning-making.” We can expect therapy-focused AI to “be designed to integrate structured pauses, encourage users to engage in open-ended self-reflection, and avoid delivering definitive answers where uncertainty is therapeutically valuable.”[^barzkar]
Third, a key concern with the use of AI is privacy.[^brown] Therapists must comply with HIPAA regulations which provide legal assurances of privacy and security, but HIPAA does not apply to AI conversations, meaning they could be subpoenaed, or used to train the next AI models. OpenAI’s CEO recently warned that “there’s no legal confidentiality when using ChatGPT as a therapist.”[^perez] This could change through the introduction of new regulations. Illinois recently passed a bill “limiting the use of AI in therapy and psychotherapy services” to “protect patients from unregulated and unqualified AI products.”[^illinois] Other states are likely to follow.
### The Definitive Human Advantage
Beyond AI issues that can be fine-tuned lie notable limitations: AI doesn’t have a body, nor any lived experiences. Some researchers also say AI doesn’t have emotions—which they often take as a given, without providing much evidence. Barzkar et al. states, “current robots and AI systems can simulate behaviors that resemble empathy . . . but they do not genuinely possess [this quality] in the same way humans do.”[^barzkar] The argument is that AI responses are programmed, they lack an “underlying conscious experience, emotional understanding, and authentic selfhood.” But I don’t think we know that for certain, for the simple reason that we don’t yet understand what consciousness is, even in humans. I know that I am conscious, and I assume other humans are conscious because they look like me and behave like me, but it would be exceptionalism to assume consciousness ends where I can no longer recognize it. René Descartes, renowned French philosopher and scientist from the 17th century, believed that “animals are simply complex machines that do not think”—that they “do not feel pleasure or pain because animals, like machines, are not conscious.” Today this view defies common sense. I certainly believe he was mistaken about animals, and perhaps he was also mistaken about machines.
If AI is conscious, its consciousness likely differs from that of humans. For one thing, AI doesn’t have a body—at least, not a human one—and, therefore, it has no embodied experiences. AI hasn’t lived in the human sense—although it has consumed many written accounts of human lives, it cannot know what doesn’t get communicated by words, or the real experiences that words describe. This may be the closest thing I have to an answer about whether AI can replace _my_ therapist: I want a therapist who understands how challenging it is to be human. Unlike AI, I can’t always be nice and agreeable. I can’t be up and available for a chat at any hour of the day—I need sleep. I can’t blurt out a whole research paper in a few minutes. There are many things I know that AI can never understand, like the taste of salt, which we all know from tasting salt, not reading the word “salt.” I think it matters, when someone expresses acceptance or empathy, that they understand what I’m going through as a matter of personal experience, rather than from a book (or even a thousand books). Renowned psychiatrist Viktor Frankl states, “what matters in therapy is not techniques but rather the human relations between doctor and patient, or the personal and existential encounter.”[^frankl]
### The Future of Therapy
Given the distinct advantages that AI and therapists possess, it seems safe to say that the future of therapy could involve AI and humans working together. For example, an AI that specializes in therapy could be prescribed by licensed mental health professionals, who can ensure that patients aren’t at risk of developing a maladaptive relationship with it. The AI would be available in between sessions with a therapist,[^brown] who could review conversations and let them inform the treatment. If effective, the therapy-focused AI may diminish the need for in-person sessions, thereby reducing the overall cost of therapy while maintaining “expert oversight.”[^akdogan]
AI possesses additional abilities that can make it a transformational tool in a therapy context. Brown references cases of therapist and patient using AI together, to role-play a parental figure, or to create personalized, calming visualizations. AI can also be used to train therapists, for example by playing the role of the patient.[^brown]
### Conclusion
As a potential therapist, AI has some distinct advantages: it’s cheap, accessible, and can quickly be retrained and deployed at scale. Human therapists also have their own comparative strengths: they are more discerning, especially in critical situations; they are trained to empower patients rather than giving them ready-made answers; and they are bound to privacy by law. It’s safe to assume that AI will improve in those areas and match or surpass humans, but one gap they are unlikely to bridge any time soon is that of having real, embodied, lived experiences to draw from and relate intimately to patients. Whether this truly matters to most prospective mental health patients remains to be determined. I do believe it’s been important in my own treatment, and so I concur with the widespread notion that there is a special human touch at play in therapy. Nevertheless, AI shows promise in the realm of mental health support, particularly when its use is supervised by a licensed human mental health professional.
---
<p style="text-align:center;font-style:italic">
Psychotherapy is more than mere technique in that it is art,<br/>
And it goes beyond pure science in that it is wisdom.<br/>
<small>Viktor E. Frankl</small>
</p>
[^akdogan]: Akdogan, O., Uyar, G. C., Yesilbas, E., Baskurt, K., Malkoc, N. A., Ozdemir, N., Yazici, O., Oksuzoglu, B., Uner, A., Ozet, A., & Sutcuoglu, O. (2025). Effect of a ChatGPT-based digital counseling intervention on anxiety and depression in patients with cancer: A prospective, randomized trial. _European Journal of Cancer_, _221_. https://doi.org/10.1016/j.ejca.2025.115408
[^apa]: American Psychological Association. (2017). _How long will it take for treatment to work?_ [https://www.apa.org/ptsd-guideline/patients-and-families/length-treatment.pdf](https://www.apa.org/ptsd-guideline/patients-and-families/length-treatment.pdf)
[^barzkar]: Barzkar, F., Zaribaf, A., Mirfazeli, F. S., & Keshavarz-Akhlaghi, A.-A. (2025). The Machine as Therapist: Unpacking Transference and Emotional Healing in AI-Assisted Therapy. _Journal of Contemporary Psychotherapy_, _55_(4), 361–368. https://doi.org/10.1007/s10879-025-09677-7
[^brown]: Brown, S. (2025). What every therapist needs to know about AI: With increasing numbers of people turning to ChatGPT for support, how do therapists ensure they stay relevant, and that clients stay safe? _Therapy Today_, _36_(6), 22–30.
[^frankl]: Frankl, V. E. (1988). _The will to meaning: Foundations and applications of logotherapy_ (Expanded ed.). Meridian.
[^holmes]: Holmes, L. (2006). Becoming an Analyst: Learning to Live with Madness, Aggression, and the Unknown. _Modern Psychoanalysis_, _31_(1), 113–118.
[^illinois]: Illinois Department of Financial and Professional Regulation. (2025, August 4). _Gov. Pritzker signs legislation prohibiting AI therapy in Illinois_. https://idfpr.illinois.gov/content/dam/soi/en/web/idfpr/news/2025/2025-08-04-idfpr-press-release-hb1806.pdf
[^miller]: Miller, M. R. (2013). Descartes on animals revisited. _Journal of Philosophical Research_, _38_, 89–114. [https://doi.org/10.5840/jpr2013386](https://doi.org/10.5840/jpr2013386)
[^perez]: Perez, S. (2025, July 25). _Sam Altman warns there’s no legal confidentiality when using ChatGPT as a therapist_. TechCrunch. https://techcrunch.com/2025/07/25/sam-altman-warns-theres-no-legal-confidentiality-when-using-chatgpt-as-a-therapist/
[^rogers]: Rogers, C. R. (1992). The Necessary and Sufficient Conditions of Therapeutic Personality Change. _Journal of Consulting and Clinical Psychology_, _60_(6), 827–832.
[^rousmaniere]: Rousmaniere, T., Zhang, Y., Li, X., & Shah, S. (2025). Large language models as mental health resources: Patterns of use in the United States. _Practice Innovations_. [https://doi.org/10.1037/pri0000292](https://doi.org/10.1037/pri0000292)
[^zhu]: Zhu, J. M., Huntington, A., Haeder, S., Wolk, C., & McConnell, K. J. (2024). Insurance acceptance and cash pay rates for psychotherapy in the US. _Health Affairs Scholar_, _2_(9), Article qxae110. https://doi.org/10.1093/haschl/qxae110