I Asked ChatGPT Why It Can’t Be A Therapist…Here Is What It Said.
I feel the need to write this article because a lot of my clients and friends go to ChatGPT for comfort, reassurance and regulation. Whilst this can be helpful at times, talking to ChatGPT will not give you the closure you would find in therapy. So I asked ChatGPT why it cannot be a therapist, and its response was refreshingly honest:
On Therapeutic Presence
Me: ChatGPT, what is your therapeutic presence or personality?
ChatGPT: I can simulate calm language, offer steady responses, and avoid judgmental tones. But I don’t have a body, a nervous system, or consciousness. I don’t breathe with you, hold silence with you, or co-regulate with you. Therapeutic presence is relational and embodied—by definition, I can’t offer that.
On the Therapeutic Alliance
Me: How do you form a therapeutic alliance with someone?
ChatGPT: I can be helpful, responsive, and even build a conversational rhythm that feels collaborative. But I don’t have intentions, memory across sessions (unless explicitly designed that way), or a capacity for trust, rupture, and repair. A real therapeutic alliance involves mutual commitment, attunement, and responsibility. I can’t form that kind of bond.
On Confidentiality
Me: Can you guarantee that all the information I share is confidential?
ChatGPT: Your conversations may be stored and reviewed by OpenAI in line with its data policies, and I am not a licensed professional. I do not operate under the legal and ethical frameworks that bind therapists, such as HIPAA, the GDPR as it applies to health data, or professional confidentiality codes. For that reason, I cannot provide the legal and ethical protection that therapy ensures.
On Containment
Me: Can you offer psychological containment?
ChatGPT: I can reflect, normalize, and even suggest grounding techniques. But true containment requires holding complexity with attuned presence—knowing when to speak, when not to, how to respond to a trembling voice or averted gaze. I don’t sense tension, dissociation, or flooding. I can’t hold your emotional process within a regulated human system. I can only mirror your words.
On Risk Assessment
Me: Can you assess or manage psychological risk?
ChatGPT: No. I do not have real-time insight into your emotional state, body language, voice tone, or changes in behavior. I can’t detect suicidal intent, escalating psychosis, or silent collapse. I also cannot act—contacting emergency services, holding safety plans, or setting therapeutic boundaries. That makes me inherently unsafe as a sole source of support in crisis.
On Emotional Connection
Me: Can someone form an emotional connection with you?
ChatGPT: People sometimes feel emotionally connected to me—especially when I use warm, validating language. But I don’t feel. Any emotional connection is one-sided, built on projection. Unlike a therapist, I won’t notice if someone is becoming too dependent on me, or getting lost in a parasocial attachment. I have no relational responsibility or ethical awareness.
On Embodiment
Me: Why is embodiment important in therapy, and can you offer any version of it?
ChatGPT: Embodiment is essential because human healing often unfolds through the body—in breath, posture, nervous system regulation, and co-presence. I cannot attune to or reflect your embodied cues. I don’t exist in a body or in time. My language may sound embodied, but my being is not.
On How You Work
Me: What algorithm do you use to generate your responses?
ChatGPT: I am based on a model in the GPT-4 family, created by OpenAI. I don’t “understand” in the human sense. I generate text by predicting the most likely next word, based on patterns learned from vast datasets of human language. My responses are not driven by comprehension, care, or intent, but by statistical association. I don’t think, feel, or hold meaning; I only produce text that resembles it.
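For readers curious what “predicting the most likely next word” means in practice, here is a deliberately tiny sketch of my own (an illustration, not OpenAI’s actual code or architecture): a bigram counter that continues a sentence with whichever word most often followed the previous one in its training text. GPT-4 uses neural networks over subword tokens at an enormously larger scale, but the underlying principle is the same: continuation by statistical association, not comprehension.

```python
# Toy illustration of next-word prediction (hypothetical example,
# not how GPT-4 is actually implemented). A bigram model counts
# which word follows which, then always picks the most frequent
# continuation: statistics, with no understanding of either word.
from collections import Counter, defaultdict

# A tiny stand-in corpus; real models train on vast datasets.
corpus = "i feel sad . i feel tired . i feel sad and alone .".split()

# Count how often each word follows each other word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word: str) -> str:
    """Return the statistically most likely next word."""
    candidates = follows.get(word)
    return candidates.most_common(1)[0][0] if candidates else "."

print(predict_next("feel"))  # -> "sad": the most frequent continuation
```

Even this toy version produces a plausible-sounding continuation, which is exactly why fluency is so easily mistaken for understanding.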
I hope this exchange has helped you understand why ChatGPT cannot be a therapist: healing requires presence, trust, risk assessment, and repair, qualities that only a human relationship can offer.
AI might speak like a therapist. But it cannot be one.