
How Far Are We from Replacing Therapists with Artificial Intelligence?

In the digital age, loneliness has also become automated. Chatbots, accessible with just a tap, no longer just speak; they listen, respond, and sometimes even imitate therapy. But do they truly help, or have they simply become the new interface of isolation?

“I told ChatGPT about my troubles. It responded like a therapist. It felt like it understood me, but did it really understand?”

In the past, emotions were shared with another human. Today, they are shared with a screen, and what stands before us is no longer a person but software. In recent years, conversation-based artificial intelligence applications such as ChatGPT, Replika, and Woebot have rapidly expanded into the field of emotional support. People do not just exchange information with these systems; they also share their emotions, fears, and inner conflicts. These “digital companions,” which always listen, respond 24/7, and never judge, have become a kind of therapeutic space for many. At this point, a key question emerges: do these conversations genuinely provide psychological support, or are we merely lingering in an illusion of algorithmic empathy?

Seeking answers to this question, researchers have conducted various experimental studies in recent years, and the findings are both striking and contradictory. AI-assisted systems appear to have the potential to offer psychological support in certain cases, but these effects are not universal, and, most importantly, the systems fall short when it comes to deepening the therapeutic process.

Alternative to Therapy or Just a Tool?

AI-based systems have emerged as a source of hope, especially for those unable to access one-on-one therapy. In a comprehensive study published in Turkey, Akkan and Ülker (2023) stated that such technologies could complement the counseling process but can never replace human therapists. According to them, these systems are “supportive but limited.”
Artificial intelligence may walk alongside you as a companion on the healing journey, but it cannot carry the journey alone.

It Can Imitate Empathy, But Not Experience It

Chatbots can skillfully deliver supportive phrases, positive feedback, and even cognitive-behavioral techniques. Phrases like “I understand you” or “It’s perfectly normal to feel this way” may seem meaningful and reassuring at a textual level. However, one must remember this is just a linguistic simulation. Empathy is not just about finding the right words; it is about internalizing emotion and resonating with it synchronously.

A real therapist does not merely label emotion; they bear witness to it. Sometimes through silence, sometimes through a pause, or even through a gaze that carries presence. Artificial intelligence, on the other hand, cannot feel the emotion or share its burden. It only produces responses based on data patterns: functional yet empty echoes. The meaning of an empathic expression lies in who says it, when, and how. The same sentence, when spoken by a human, can heal; when delivered by an algorithm, it might only seem “logical.”

Akkan and Ülker’s (2023) study is particularly cautionary here. According to them, “the inability to establish a therapeutic bond is one of the most significant limitations of technological tools” (p. 59). A bond is not just about communication; it requires mutual emotional investment, and chatbots can offer only a simulation of that investment. Furthermore, in human therapeutic processes, what remains unsaid sometimes conveys the most meaning. A therapist’s furrowed brow, a forward-leaning posture, or teary eyes are signals that artificial intelligence cannot reproduce, yet they are fundamental to healing. Thus, the fact that an AI can act like a therapist does not make it a therapeutic figure. These systems can only model empathy; they cannot live it.

What Do Real Experiments Say? Evidence from Five Studies

The impact of artificial intelligence on therapy is no longer merely a theoretical discussion; it is now being measured scientifically. Over the past five years, numerous randomized controlled trials have been conducted worldwide to assess the psychological effects of these technologies. The results call for both optimism and caution.

Jacobson et al. (2025) – USA / NEJM AI
In a striking study published in NEJM AI, 419 participants interacted weekly with an AI chatbot. The control group received only informational messages.
Result: The chatbot group showed significant short-term improvements in depression and anxiety. However, because no lasting therapeutic relationship developed, the effects did not persist.
“It provided emotional support but did not form a relationship.”

Klos et al. (2021) – Argentina / JMIR Formative Research
In this pilot study, university students interacted with the Tess chatbot for eight weeks. Significant improvements were observed in anxiety, though not in depression. Interestingly, some students later felt that the chatbot conversations became artificial and predictable, suggesting that while useful in the short term, chatbots struggle to establish lasting emotional trust.

He, Zhang & Wang (2022) – China / Internet Interventions
This randomized six-week study showed that students with high emotional awareness benefited more from chatbot use, while those who were emotionally distant did not.
This finding suggests that the therapeutic effect depends not only on the system but also on the user’s own emotional capacities.

Sabour et al. (2022) – China / IEEE Access
This study assessed chatbots’ ability to provide empathic responses. Participants initially found interactions meaningful, but by week three, many noted the conversations became repetitive and “pre-programmed.” The inability to form emotional bonds led to a rapid decline in positive effects.

Akkan & Ülker (2023) – Turkey / Fırat University
Though not experimental, this theoretical and ethical analysis evaluated user expectations and concerns in Turkey. Most participants found the systems accessible but emphasized that they cannot substitute for the depth of a human therapeutic relationship.

Privacy: Who Holds Our Most Intimate Words?

One of the least discussed but most critical risks of AI-based therapeutic systems is data privacy. Emotions shared by users are processed by algorithms under often opaque data policies.

A sentence written to an AI becomes a data object, and data today is an economic resource. Akkan and Ülker (2023) highlight that the way user input is processed lacks transparency. Strong international regulations on this matter are still pending. Opening up to an AI may also mean opening up to an invisible data economy.

Artificial Intelligence May Guide – But It Cannot Heal Like a Therapist

AI-powered systems can provide meaningful support in terms of accessibility. However, the depth of the therapeutic process still depends on human connection. Being understood in silence, being felt through eye contact, or having someone bear witness to your pain are human experiences that artificial intelligence cannot replicate. Thus, AI should be seen not as a replacement for therapy, but as a complementary tool. A hybrid model may be the most realistic path forward for the mental health systems of the future.

References

Akkan, G., & Ülker, S. V. (2023). Ruh sağlığı hizmetlerinde yapay zeka uygulamaları ve ilişkili teknolojiler [Artificial intelligence applications and related technologies in mental health services]. Fırat Üniversitesi Sosyal Bilimler Dergisi, 33(1), 55–64.
He, L., Zhang, Y., & Wang, Y. (2022). Using AI chatbots to provide self-help depression interventions for university students: A randomized controlled trial. Internet Interventions, 27, 100500. https://doi.org/10.1016/j.invent.2021.100500
Jacobson, N. C., Heinz, M. V., Mackin, D. M., et al. (2025). Randomized trial of a generative AI chatbot for mental health support. NEJM AI. https://ai.nejm.org/doi/full/10.1056/AIoa2400802
Klos, M. C., Escoredo, M., Joerin, A., Lemos, V. N., Rauws, M., & Bunge, E. L. (2021). Artificial intelligence–based chatbot for anxiety and depression in university students: Pilot randomized controlled trial. JMIR Formative Research, 5(8), e20678. https://doi.org/10.2196/20678
Sabour, S., Zhang, W., Xiao, X., Zhang, Y., Zheng, Y., Wen, J., Zhao, J., & Huang, M. (2022). Chatbots for mental health support: A scoping review. IEEE Access, 10, 70038–70051. https://doi.org/10.1109/ACCESS.2022.3188614

Melisa Balkandere
Melisa Balkandere is a clinical psychologist and a psychology writer with a focus on emotional depth. Her work centers on key psychological themes that shape one’s inner world, including body image, eating behaviors, and emotional regulation. She specializes in cognitive behavioral therapy, adult psychotherapy, and trauma-informed approaches. Balkandere approaches psychological concepts not only through an academic lens but also within cultural, societal, and emotional contexts. She writes not merely to inform, but to create a gentle space for the unspoken questions her readers carry. In her words, scientific clarity and human vulnerability stand side by side. “Some emotions only become lighter once they’re named.”
