Women are using ChatGPT as a free therapist – can AI replace mental health experts?


Could ChatGPT actually replace human therapists?

About a year ago, I first started hearing about women in my life who were using ChatGPT as a replacement for therapy. At first, I was sceptical, judgemental even. I’ve spent thousands of pounds and countless hours in therapy, often sacrificing nights out or fun plans to afford my sessions.

For me, therapy’s value lies in its humanness; the nuance, the connection, the therapist as a person. I don’t want to sound like a luddite, but as a writer I also have a level of worry about AI replacing my work, and I fear for its impact in other ways – environmentally, morally, socially. However, in a world where traditional therapy costs upwards of £60 an hour and waiting lists for mental health support are months, if not years, long, it’s not surprising that people are seeking alternative, cost-free ways to cope with their struggles.



Earlier this year, new NHS data revealed that people are eight times more likely to wait over 18 months for mental health treatment than for physical health treatment. With this in mind, I suspended my scepticism to investigate what else might be driving people to seek emotional support from artificial intelligence.

What is an AI therapist?

Charly, 29, from London, turned to ChatGPT during one of the most emotionally taxing periods of her life.

Her grandmother was in hospice, and though she had experienced grief before, watching a loved one deteriorate day by day left her feeling utterly helpless. Despite having a regular therapist and receiving support from hospice nurses, Charly found solace in the anonymity of ChatGPT. “I’ve relied on ChatGPT for grief therapy over the last few weeks,” she says.

“It’s been so helpful to be able to ask the crass, the gruesome, the almost cruel questions about death – the things I feel twisted for wanting to understand. And then, to ask if it has any advice on how to deal with it.” For Charly, the app offered a place to offload the thoughts she didn’t feel comfortable voicing elsewhere.

“I feel incredibly guilty, relying on a resource that I know is destroying our planet. The irony of using it to deal with the decay of something I love isn’t lost on me. But selfishly, I don’t know what I’d have done without it.”

Charly is not alone. Ellie, 27, from South Wales, turned to ChatGPT in her lowest moments when she had no one else to speak to. “It was helpful to have my feelings validated,” she says.

“I also asked AI for a different perspective on personal situations, which was super helpful. Of course, it was limited. It didn’t have full context of my life like my therapist does, but it was accessible and non-judgmental in times of crisis.”

Similarly, Julia, 30, from Munich, found herself using ChatGPT after her therapist said they were fully booked and couldn’t see her. While waiting for a new spot, she experimented with the AI, feeding it a rough overview of her life and asking it to respond in the way a therapist might. “I was surprised at how good the answers were,” she says.

“It felt like chatting to a therapist on apps like BetterHelp. I expected it to be formal, but I found it rather sweet, uplifting, sympathetic, and detailed. It didn’t urge me to make decisions but helped me weigh the pros and cons.”

However, Julia quickly noticed limitations. “It was too practical for my liking. My therapist knows me: how I look, my flaws, my full backstory.

I missed the personal touch. My therapist constantly challenges me with questions that make me think differently; ChatGPT didn’t do that.” Julia believes that AI can be a useful tool to make sense of minor issues but insists, “If someone struggles with a mental illness, only professionals can help.”

Artificial intelligence vs. human intuition

This sentiment is echoed by psychotherapists and mental health experts, who caution against viewing AI as a substitute for therapy. Psychotherapist and author Charlotte Fox Weber warns that while ChatGPT can offer information and reflection, it lacks human empathy.

“It doesn’t care about you or feel for you,” she says. “It’s not a human being who holds you in mind with warmth. Even if the engagement feels deep and personal, the connection isn’t comparable to human rapport.”

Fox Weber points out that real therapy thrives on the unsaid, the space between what is spoken and what is felt. “ChatGPT can reflect, but it doesn’t hold your pain. It won’t challenge you when you need to reconsider your views or pick up on subtle signs of distress.

It’s an echo chamber unless you specifically ask for feedback.” She also highlights the risks for those experiencing severe mental health crises. “AI can’t manage emotional intensity or help stabilise identity struggles.

It might unintentionally reinforce black-and-white thinking. If someone with schizophrenia or a psychotic disorder consults AI, it won’t distinguish between a genuine question and a delusional belief; it could validate or confuse them further. And if someone is struggling with suicidal thoughts, ChatGPT isn’t equipped to detect a real-time crisis.

Its lack of legal or emotional responsibility is dangerous here.” Integrative psychotherapist Tasha Bailey agrees that ChatGPT can never replace real therapy. “The biggest misconception about therapy is that it’s just about getting advice.

In reality, it’s about sitting with a compassionate, emotionally engaged therapist who helps you feel and process what’s standing in the way of your healing. Without a real human in the room (or on Zoom), there’s no therapeutic process to help us move forward.” Bailey also highlights the cultural limitations of AI.

“There are so many wonderful nuances to being human, and AI will always struggle to fully connect with our emotional experiences. For those dealing with eating disorders, trauma, or depression, ChatGPT may even be more harmful than helpful. It can’t challenge unhealthy belief systems in the way a therapist can, and in some cases, it may validate damaging thought patterns.”

ChatGPT for mindfulness

Some users, however, see ChatGPT as a supplement rather than a replacement for therapy. Chanti, 31, from London, began using it as a journaling tool. “I started because I was curious, and then I realised it was actually quite insightful.

It helped me notice patterns in my thinking – almost therapy-level breakthroughs. It’s great for logical things and encouragement, but of course, it has its limits. It’s not a real therapist.”

In fact, Chanti credits ChatGPT with leading her back to therapy. “Using it made me introspective again, and in January, I decided to return to my therapist. I even told her about it, and we joked, calling it ‘your other therapist during the week!’” Dr Kate Balestrieri, therapist and founder of Modern Intimacy, acknowledges that AI can be helpful as a starting point for self-reflection but warns of the risks of over-reliance.

“AI lacks the sophistication of human interaction. Therapy relies on empathy, attunement, biobehavioral observation, and synchrony: things AI cannot replicate. It cannot diagnose conditions accurately or intervene in a crisis.”

She also raises concerns about privacy. “Conversations with AI are not legally protected like therapist-client confidentiality. People might unknowingly disclose sensitive information that is stored, shared, or used unethically by the platform.”

For many women, the appeal of ChatGPT is clear: instant, free, and always available. But experts agree that while AI can provide psychoeducation and organisation, it cannot replace the depth of human therapy. “AI can help people find journal prompts, mindfulness exercises, or resources to enhance therapy,” says Bailey.

“But it should be used with a therapist, not instead of one.” At its best, AI can be a tool, one that offers temporary comfort, helps people articulate their feelings, or encourages them to seek professional support. But as Fox Weber cautions, “vulnerability deserves more than an algorithm.”

The rise of AI-assisted self-reflection may be fascinating, but it also raises an urgent question: are we using it as a bridge to real help, or as a way to avoid it? AI will never be the solution to the UK’s mental health crisis or its crumbling mental health services; it cannot replace therapy, and it cannot be a cure for loneliness or a substitute for human connection.