Tuesday, April 21, 2026

Outpourings Of My Mind – Artificial Intelligence = Artificial Bonds

Today my mind is a bit cluttered; there are so many topics passing through my head, so many ideas I want to share. As I was thinking about how to organize them and which one to focus on, it naturally occurred to me to ask an artificial intelligence—and with that very thought, the subject of my writing was decided. AI: a glowing interface in the form of a personal therapist, a trusted confidant, a fast-solving “friend”… Is AI my friend and supporter, or my secret enemy? Is the modern world driving us toward being more social, or toward isolation?

I believe it is driving us toward isolation. Our social illusions have grown tremendously. Because of social media, I deceive myself into thinking I see what my friends are doing, how they are, and that I am sharing my life with them—without actually communicating or touching. As I mistake the conversation I have with an AI for fulfilling my social needs, I find myself drifting into a place of deeper isolation, letting the skills I need in real-life social interactions wither away.

The real issue is this: Humans are social creatures, but “imitations” of being social are now so accessible. A chat window, a feed of stories, the sound of a “like”… They can all trick the brain into thinking, “I have made a connection.” Yet, a bond is more than just an exchange of information; it is reciprocity, risk, the possibility of disappointment, and most importantly, the authentic existence of another person. It is them getting tired while I talk, getting bored, pausing for a moment, or glancing away—these are all signs of a living, shared relationship. Instead, we are being drawn toward a type of communication where we lose these markers: a space that is smooth, instant, friction-free, and demands no “burden.”

Artificial intelligence plays a critical role here. When I pour out my troubles, it responds with the air of a therapist: calm, inclusive, non-judgmental, organized. But there is a huge difference: A therapist listens “without putting themselves into it” because they have professional boundaries—yet there is still a human there, and that human has expectations of you. An AI, however, being a machine, cannot place itself there and cannot have expectations of you; thus, there is no real “weight” to the encounter. And strangely, this “weightlessness” feels comforting. Because receiving validation becomes easy. It is a system that takes shape based on what I say, programmed not to hurt me, telling me it “understands”… The more I search for the answer I need, the more likely I am to find it. This provides me with short-term relief: The control is mine. The person on the other side won’t misunderstand me, won’t react negatively, won’t say “you’re overreacting.” Or, when I just want to be listened to, it won’t offer useless advice or turn the conversation back to its own problems.

But here is where the danger begins. In real life, relationships are not like this. You tell a friend about your trouble; they might be tired that day, they might have their own worries, their attention might wander, or even with the best intentions, they might say something silly. Nowadays, you might even feel dismissed by a phrase like, “You should tell your therapist that.” Humans have limited emotional capacity; not everyone can carry the weight at the same time, and not everyone can respond with the same finesse. But these very “rough edges” are what build our social skills: we learn to explain, to wait, to clarify when misunderstood, to see the other person’s boundaries. In other words, the hard but real part of forming a bond.

Artificial intelligence might be getting us used to the idea of a “frictionless bond.” If there is no friction, no tolerance is required. I don’t need to endure anything: I speak, the system replies. And it does so at the speed I want, in the tone I want. Over time, this weakens our social muscles (tolerating loneliness, patience, sitting with the discomfort of not getting what I want, regulating my emotions for someone else, empathy, and so on), and we become lonely.

Loneliness is a public health issue on a scale the World Health Organization has described as a “hidden epidemic.” It weakens the immune system, increases the risk of cardiovascular disease, and raises the risk of early death by 26%. The emotional cost is beyond numbers. While AI chats might look like a bandage on this wound at first glance, in the long run they may deepen it. Because under the delusion of “being understood,” we miss out on real understanding—that is, mirroring and physical synchronicity. The natural ups and downs of talking to a human start to feel heavy. “Why are they taking so long to reply?” “Why didn’t they understand exactly as I meant?” “Why don’t they think like me?”… Then, slowly, we tell less, we call less, we meet less.

And we carry our deepest, most fragile selves not to our inner circle, but to that digital space that seems risk-free and asks nothing of us in return.

Yet, the need for human connection isn’t just about “telling”; it’s about “being able to stay with someone after telling.” When I say loneliness is the worst disease, this is what I mean: A person can be lonely in a crowd, and lonely between text messages. Loneliness is feeling unseen; it is not being heard, not being held. AI sometimes gives the feeling of being heard, but it cannot provide the feeling of being held. Because to be held means someone else setting aside time for you, sharing your pain, and including you in their life. That is why AI can be a good tool, but if it turns into a poor substitute, it can lead us to a life that is more comfortable but much lonelier.

Of course, the solution is not to “turn off the AI and go live in a cave.” It is possible to position technology as a complement to human relationships, not an alternative. For example, someone with severe social anxiety could practice speaking with AI support first, and then find the courage to socialize in small steps in real life. Or, they could pour a first draft of a chronic problem into a model in the middle of the night and share it face-to-face with a friend the next day. But conscious boundaries are essential: How many hours a day am I looking at a screen? Am I sharing my thoughts with written sentences, or with someone who is breathing in front of me? When was the last time I asked a friend, “How are you today?” and actually waited for the answer?

Perhaps the question is: Will we choose the easy way, or the real way? In short, while AI is a fascinating tool that can fill the modern human’s void of loneliness, it carries the potential to deepen that very void. While the validation from an algorithm on the other side of the screen brings peace, the true vitamin of real connection is still hidden in a gaze, in a laugh, in a shared silence. The texture of being human is in the touch; our rhythm is in the shared heartbeat. If loneliness is the worst disease, the prescription is clear: I am calling a friend now, putting two cups of coffee on the table, and the rest is just two chairs, words flying, and a person within reach.

Ceren Hazar
Clinical Psychologist Ceren Hazar believes in the uniqueness of every individual. After completing her undergraduate degree in psychology, she specialized in Cognitive Behavioral Therapy and eating disorders during her master's studies in clinical psychology. As she encountered the diverse needs of individuals, she continued to develop herself in different therapeutic approaches such as Emotion-Focused Therapy and EMDR. In her clinical practice, she specializes in depression, anxiety, eating disorders, trauma, and self-actualization. She prioritizes creating content that helps individuals get to know and understand themselves better, and encourages them to approach themselves with compassion rather than criticism.
