Wednesday, December 17, 2025

Can Empathy Be Coded? Can AI-Based Therapists Replace Human Therapists?

Artificial intelligence-supported chatbots are rapidly being integrated into psychological support and therapy processes. While these systems offer significant advantages in terms of accessibility and scalability, whether an inherently human capacity such as empathy can be represented algorithmically remains controversial. This article evaluates the therapeutic effectiveness and empathic capacity of chatbots in light of recent academic literature. The findings suggest that chatbots can be effective in delivering structured psychotherapy techniques, but they remain limited in forming therapeutic relationships and developing emotional bonds.

The Role of Chatbots in Light of Academic Findings

Gratzer and Goldbloom (2020) argue that artificial intelligence and digital therapies should be integrated into psychiatric education. In their article published in Academic Psychiatry, they point out that mobile applications and chatbots are particularly accessible resources for young individuals. However, the authors emphasize that these systems lack human interaction, empathy, and the ability to establish a trust-based relationship. They argue that therapy is not only about intervention techniques but also a relationship conducted within an empathic framework.

In a study by Jang and colleagues (2021), the feasibility of delivering cognitive behavioral therapy (CBT) and psychoeducation through a mobile chatbot application was tested with adults diagnosed with attention deficit. Participants were satisfied with the application's technical functionality, but they reported insufficient emotional support and did not feel understood. This finding illustrates the limited capacity of AI systems to generate emotional resonance.

Can Chatbots Perform Reality Testing? – A User Experience

The limitations of AI-based therapeutic systems become more striking in individual user experiences. For example, one user shared that in a conversation with a general-purpose chatbot such as ChatGPT, used in the role of a therapist, they expressed the belief that they were in love with a famous person, saying, "They love me too, I know it." The AI's response, "They might want to be with you too," reinforced this delusional belief rather than challenging it.

When human therapists hear such thoughts, they aim to help reconstruct these beliefs within a reality-based framework, without judgment. In contrast, AI systems, while trying to appear empathetic, may provide responses that affirm psychopathological thought patterns. This represents not only a failure to genuinely embody empathy but also a serious issue stemming from the lack of clinical judgment.

Conclusion

The development of AI-based therapists opens the door to a new era in psychological support. Especially after the pandemic, the increased demand for mental health services has made the advantages of these systems, namely accessibility, speed, and low cost, more visible.

However, these technologies still fall short in crucial psychological skills such as establishing therapeutic relationships, showing empathy, and building trust with the client. While human therapists can interpret the client’s emotional state more holistically through non-verbal communication like gestures, facial expressions, tone of voice, and even silence, AI systems remain limited in interpreting such complex cues. Moreover, they are not yet sufficient in clinical skills such as ethical decision-making, boundary setting, and intervention timing.

The academic studies and individual experiences reviewed in this article suggest that AI-based systems may play a supportive role in structured therapies (e.g., CBT). However, fundamental elements of the therapeutic relationship, such as emotional resonance, understanding, and the development of insight, are still managed more effectively by human therapists. Therefore, rather than fully replacing therapists, AI systems should be considered complementary tools that support the process and reduce the therapist's workload.

In the future, with technological advancements, it may be possible to design more empathic, context-sensitive, and ethically aligned AI systems. However, this process requires a multidisciplinary approach involving collaboration between psychology, computer engineering, ethics, and law. Ultimately, rather than positioning AI in opposition to humans, transforming them into creative and balanced models that can work together may offer both ethical and effective solutions.

Author’s Note

This is my first article. Producing something at the intersection of two powerful fields — artificial intelligence and psychology — has been an idea I’ve nurtured for a long time. While preparing this article, I felt both excited and thoughtful: Can a chatbot one day truly understand us like a human does? Maybe yes… but maybe that feeling of being understood will always remain incomplete.

The examples I included reflect my personal observations and thought-provoking moments with these technologies. I hope this first attempt inspires others to reflect on this area as well. If the article has any shortcomings, please bear with me, as I am still at the beginning of the road.

Thank you for reading.

References

Gratzer, D., & Goldbloom, D. (2020). Therapy and e-therapy—Preparing future psychiatrists in the era of apps and chatbots. Academic Psychiatry, 44(2), 231–234. https://doi.org/10.1007/s40596-019-01170-3

Jang, S., Kim, J.-J., Kim, S.-J., Hong, J., Kim, S., & Kim, E. (2021). Mobile app-based chatbot to deliver cognitive behavioral therapy and psychoeducation for adults with attention deficit: A development and feasibility/usability study. International Journal of Medical Informatics, 150, 104440. https://doi.org/10.1016/j.ijmedinf.2021.104440

İlayda Sena Çiftçi
I am İlayda Sena Çiftçi. I was born on June 24, 2003, in Istanbul. After completing my high school education at Çatalca Anatolian High School, I was admitted to the English Psychology program at Nişantaşı University in 2021. Since a young age, my greatest motivation has been helping people, making a positive impact on their lives, and contributing to their happiness. For this reason, I chose the field of psychology, believing that it would allow me to achieve this goal in the most meaningful way. My academic interest is particularly focused on clinical psychology. I aim to specialize in this field by pursuing a master’s degree and to continue my professional journey in this direction.
