Introduction
Scroll through your social media feed for a moment, and you will likely encounter a familiar yet uncanny sight. You see faces you know—friends, colleagues, perhaps even your own—but they have been radically transformed. The skin is porcelain, the jawlines sharp, the lighting cinematic. These are not photographs; they are the output of advanced AI image generators like Lensa or Midjourney. Millions of users are voluntarily feeding their biometric data into these algorithms to receive a “perfected” version of themselves.
On the surface, this seems like harmless fun. However, as psychology students and future clinicians, we must look deeper. Is this trend merely a technological novelty, or is it a modern manifestation of what Donald Winnicott (1960) described as the False Self? More importantly, are we witnessing the emergence of a new, digital defence mechanism against the vulnerabilities of the modern ego?
Beyond DSM-5: A Psychodynamic Perspective
To understand the psychology behind AI avatars, we must first look at how the psychiatric establishment defines narcissism. If we rely solely on the DSM-5, we might dismiss this trend as simple vanity. The DSM-5 defines Narcissistic Personality Disorder largely through observable behaviours: grandiosity, a need for admiration, and a lack of empathy. Through this lens, a person posting AI portraits is simply “showing off.”
However, this definition is often too superficial for clinical work. This is where the Psychodynamic Diagnostic Manual (PDM-2), edited by Vittorio Lingiardi and Nancy McWilliams (2017), offers a crucial correction. The PDM-2 shifts the focus from external symptoms to internal experience. It suggests that the core of narcissistic personality organisation is not an excess of self-love, but rather a profound sense of emptiness, shame, and fragility. The grandiose exterior is merely a shield.
When viewed through the PDM-2 lens, the AI avatar trend takes on a melancholic meaning. It is not an act of arrogance; it is an attempt to soothe an internal sense of inadequacy. The user is essentially saying, “Please see this perfect version of me, so you don’t see the flawed, anxious person I feel like inside.”
Splitting And The Idealised Self-Object
The creation of these images involves a psychological transaction that mirrors the dynamics of splitting. To generate an avatar, a user uploads 10 to 20 “real” photos—selfies showing pores, asymmetry, and fatigue. In handing these over to the AI to be “fixed,” the user unconsciously performs an act of devaluation toward their real self. They admit to the machine that their biological reality is insufficient.
In exchange, the algorithm functions as an idealised self-object, a concept from Heinz Kohut (1971). It returns images where the user is stripped of all human frailty. The resulting Digital False Self is a compliant, invulnerable persona presented to the world to hide the True Self. Kohut spoke of the child’s need for the “gleam in the mother’s eye”—the mirroring that confirms we are valuable. In the digital age, the algorithm provides a synthetic version of this gleam. It mirrors us not as we are, but as we desperately wish to be.
Identity Diffusion And The Beautiful Mask
Since we cannot display these images here, let us perform a mental comparison. Imagine a split screen. On the left, a candid photograph of a person. You see micro-expressions: a droop in the eyelid suggesting tiredness, or tension in the jaw. This face tells a human story.
Now, imagine the AI-generated version on the right. The features are symmetrical, the lighting dramatic. But something is missing. The AI removes the ambiguity of human emotion. The avatar looks confident or stoic, but lacks vulnerability. By erasing imperfections, the AI also erases the capacity for genuine connection. We are left with a beautiful mask that invites admiration but deflects intimacy. This aligns with Otto Kernberg’s (1975) concept of identity diffusion, in which contradictory self-representations fail to cohere into an integrated identity, leaving only a superficial image.
Body Alienation And Digital Dysmorphia
This mechanism carries significant risks. The danger lies in the gap between the digital ideal and physical reality. As we interact through these avatars, we risk becoming alienated from our own bodies.
We are already seeing phenomena like “Snapchat Dysmorphia,” where patients seek cosmetic procedures to look like their filtered selfies. The AI avatar takes this further, creating a standard of beauty that is biologically impossible. When a person bonds with their Digital False Self, their actual reflection becomes a source of distress. The body becomes an obstacle to be edited away, rather than the home of the self.
Clinical Implications: Understanding Digital Defences
The rise of AI self-portraits is a digital laboratory for human psychology. It highlights our collective hunger for validation and fear of vulnerability. For mental health professionals, understanding this is not optional.
We cannot dismiss these behaviours as superficial. As Prof. Lingiardi emphasises in his work on the therapeutic alliance, understanding a patient requires understanding their defences.
Today, those defences are digital. If we want to connect with the True Self of our future clients, we must first understand why they built such elaborate digital masks. The task of the modern psychologist is to help the patient realise that while the AI avatar may be perfect, it is the flawed, vulnerable human face that is capable of being truly loved.
References
Kernberg, O. F. (1975). Borderline conditions and pathological narcissism. Jason Aronson.
Kohut, H. (1971). The analysis of the self. International Universities Press.
Lingiardi, V., & McWilliams, N. (Eds.). (2017). Psychodynamic diagnostic manual, second edition (PDM-2). Guilford Publications.
Winnicott, D. W. (1960). Ego distortion in terms of true and false self. In The maturational processes and the facilitating environment (pp. 140–152). International Universities Press.