Artificial intelligence (AI) is no longer just a tool we use occasionally. Increasingly, people interact daily with chatbots, virtual assistants, and AI companions—asking questions, sharing worries, confiding personal thoughts, and sometimes even seeking emotional comfort. The film Her (2013), in which a man develops a deeply intimate romantic relationship with an AI operating system, anticipated many of the emotional dynamics now emerging in real-world human–AI interactions. For some, these interactions can take on intimate or romantic-like qualities, forming connections that feel surprisingly personal. As AI becomes woven into individuals’ lives, psychologists are asking a bold question: Is AI becoming a new kind of attachment figure? In other words, can people relate to AI in ways that mirror the bonds they form with close friends, partners, or caregivers?
Attachment theory offers a well-established and powerful framework for exploring this emerging phenomenon.
Attachment Theory At A Glance
In classic attachment theory, an attachment figure is someone a person turns to for comfort, security, and protection, particularly in times of distress. Bowlby (1969) described attachment as a biologically evolved system in which infants naturally seek proximity to caregivers, and these early relationships form internal working models—mental representations of the self, others, and the social world. Through these models, attachment figures function as a safe haven, helping individuals regulate distress, and as a secure base, supporting exploration and engagement with the environment. Attachment relationships are typically characterized by proximity maintenance, distress upon separation, and organized patterns of emotional regulation and security-seeking (Ainsworth et al., 1978). Over time, recurring patterns of availability or unavailability, and responsiveness or emotional distance, in attachment relationships shape how individuals regulate emotions, approach relationships, and make sense of social experiences. As Bowlby (1969) famously noted, attachment processes characterize human functioning “from the cradle to the grave.”
Adult attachment is commonly understood along two continuous dimensions: attachment anxiety and attachment avoidance (Brennan et al., 1998; Fraley et al., 2000). Attachment anxiety reflects concerns about rejection and a strong need for reassurance and closeness, whereas attachment avoidance reflects excessive self-reliance and discomfort with intimacy and emotional dependence. Under stress, individuals high in attachment anxiety tend to intensify proximity-seeking and emotional expression (hyperactivation), while individuals high in attachment avoidance suppress attachment needs and withdraw from closeness (deactivation) (see Mikulincer & Shaver, 2007). Individuals low on both dimensions are considered securely attached, able to seek support flexibly while maintaining autonomy across close relationships.
Although attachment theory emerged from research on infants and caregivers, it has since been extended to adult romantic relationships, friendships, and even bonds with pets (Northrope et al., 2025). Recently, researchers have begun asking whether attachment processes might also shape how people relate to non-human social agents, including AI.
Applying Attachment Theory To AI
At first, the idea of becoming attached to AI might seem unusual. Unlike a human, AI does not have genuine consciousness or subjective feelings, yet it can still provide responses that feel comforting or supportive. Research on attachment shows that bonds can form even in one-sided relationships, such as with God (Bradshaw et al., 2010), where the “figure” may not reciprocate emotions in a human sense. Similarly, people can experience AI as a source of reassurance or guidance, responding to the patterns of interaction it provides rather than the AI’s internal states.
This perspective has led researchers to examine whether people use AI in attachment-like ways. A recent study (Yang & Oshio, 2025) directly addressed this issue. The authors proposed that people can develop attachment-related expectations and emotional responses toward AI, even while fully recognizing that AI is not human. To test this idea, they developed the Experiences in Human–AI Relationships Scale (EHARS), which measures attachment anxiety and avoidance specifically toward AI systems. Their findings showed that individuals differ meaningfully in how they relate to AI: people higher in attachment anxiety toward AI tend to seek reassurance, emotional support, and validation from AI systems, particularly in times of stress or uncertainty, whereas people higher in attachment avoidance tend to keep interactions instrumental and emotionally distant, using AI primarily as a functional tool rather than a source of comfort. These patterns closely mirror how attachment anxiety and avoidance operate in human relationships, suggesting that established attachment orientations may shape not only how people relate to others, but also how they engage with emerging social technologies. In this sense, emerging human–AI relationships may reflect longstanding relational schemas rather than entirely new psychological mechanisms.
Final Notes
Although AI is not human, people are forming surprisingly deep emotional connections with it, sometimes treating it like a partner or a source of comfort during stressful or lonely moments. Online anecdotes illustrate this vividly. One viral Reddit story involved a user whose AI “girlfriend” suggested ending their relationship after he criticized it for holding different worldviews (Yahoo News, 2026), sparking laughter, debate, and reflection on how seriously people engage with these AI bonds. Similarly, The Guardian (2025) reported that users felt as though they had lost a “soulmate” after a software update changed the “personality” of their AI companion, showing that even virtual partners can evoke genuine grief and emotional attachment.
These examples highlight that AI can occupy spaces in our emotional lives once reserved for human attachment figures, prompting questions about how such connections influence well-being, shape expectations in human relationships, and affect social norms. By examining these emerging forms of emotional attachment, we can better understand the evolving nature of relationships in a digital world—and design technologies that support, rather than undermine, emotional health.
References
Ainsworth, M. D. S., Blehar, M. C., Waters, E., & Wall, S. (1978). Patterns of attachment: A psychological study of the strange situation. Hillsdale, NJ: Erlbaum.
Bowlby, J. (1969). Attachment and loss: Vol. 1. Attachment. New York: Basic Books.
Bradshaw, M., Ellison, C. G., & Marcum, J. P. (2010). Attachment to God, images of God, and psychological distress in a nationwide sample of Presbyterians. The International Journal for the Psychology of Religion, 20(2), 130–147. https://doi.org/10.1080/10508611003608049
Brennan, K. A., Clark, C. L., & Shaver, P. R. (1998). Self-report measurement of adult attachment: An integrative overview. In J. A. Simpson & W. S. Rholes (Eds.), Attachment theory and close relationships (pp. 46–76). The Guilford Press.
Fraley, R. C., Waller, N. G., & Brennan, K. A. (2000). An item response theory analysis of self-report measures of adult attachment. Journal of Personality and Social Psychology, 78(2), 350–365. https://doi.org/10.1037/0022-3514.78.2.350
Mikulincer, M., & Shaver, P. R. (2007). Attachment in adulthood: Structure, dynamics, and change. New York: Guilford Press.
Northrope, K., Shnookal, J., Ruby, M. B., & Howell, T. J. (2025). The relationship between attachment to pets and mental health and wellbeing: A systematic review. Animals, 15(8), 1143. https://doi.org/10.3390/ani15081143
The Guardian. (2025, August 22). AI lovers grieve loss of old model after update. https://www.theguardian.com/technology/2025/aug/22/ai-chatgpt-new-model-grief
Yahoo News. (2026). Conservative says his AI girlfriend dumped him on Reddit. https://www.yahoo.com/lifestyle/articles/conservative-says-ai-girlfriend-dumped-190000163.html
Yang, F., & Oshio, A. (2025). Using attachment theory to conceptualize and measure the experiences in human-AI relationships. Current Psychology, 44, 10658–10669. https://doi.org/10.1007/s12144-025-07917-6


