The advent of artificial intelligence in mental health is transforming how therapy is accessed, delivered, and experienced. What began as a handful of pilot chatbots offering cognitive behavioural therapy exercises has grown into a technology-driven ecosystem of apps striving to bridge the enormous gap in psychological care. Yet even as downloads rise and use expands, clinicians remain wary: therapy, they insist, cannot be reduced to a script; it is a relationship.
In urban centres such as Delhi, where demand for licensed mental health professionals consistently outstrips supply, AI-based therapy apps have become increasingly prominent. Apps like Woebot and Wysa, promoted as caring companions, use scripted, algorithm-driven prompts to mimic therapeutic dialogue: they ask questions, recommend exercises, and offer affirmations, drawing heavily on CBT models.
For individuals coping with stress, anxiety, or insomnia, these resources offer immediate access: no waitlists, no fees, and none of the perceived stigma attached to conventional therapy. The promise of convenience, however, is not the same as proof of effectiveness. Dr. Neha Bhattacharya, a clinical psychologist practising in South Delhi, sees a role for technology, but only within clear limits. “A chatbot can pose CBT-type questions,” she said, “but it cannot read what is not being said. It is not aware of a history of trauma or the fact that silence is meaningful.
Therapy is not techniques—it’s timing and empathy.” That is where most experts feel today’s AI tools fall short. A trained therapist can read a change in voice, a prolonged silence, or an abrupt shift in demeanour; a chatbot cannot.
These are the subtle cues therapists rely on to read emotional states, adjust their approach, and decide when to probe deeper or to back off. In the therapy room, a raised eyebrow or a trembling hand often says more than words. At the same time, more people than ever are seeking mental health support.
A 2024 study found a 37% rise in the number of people seeking therapy after the pandemic, with the biggest jump in rural areas. Yet the supply of therapists has not kept pace: India has only about three psychiatrists for every one million people.
That is far below what is needed. The gap has made AI tools more common, especially among young people accustomed to living through their phones and used to getting instant replies online. Meanwhile, data security concerns cast a shadow over the industry.
Mental health applications typically gather user data to improve their services, but the sensitivity of that information raises questions about how it is stored, how consent is obtained, and how it might be misused. Privacy specialists argue that without robust regulatory checks, even well-meaning platforms can put users at unintended risk. Even as these concerns persist, some mental health professionals are experimenting with hybrid models.
In these models, AI supports therapists rather than replacing them: it can summarise sessions, track mood changes, or send reminders between appointments. Some psychologists in Bengaluru and Mumbai are running studies that use AI tools with patients who have anxiety disorders.
These tools are designed to catch early warning signs of a relapse so that help can be offered sooner.
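In rough terms, that kind of monitoring can amount to a simple rule applied to self-reported mood check-ins. The sketch below is purely illustrative; the rolling window, threshold, and function names are assumptions made for the sake of example and are not drawn from any of the apps or studies mentioned here.

```python
# Hypothetical sketch: flag a possible early-warning signal from daily
# self-reported mood scores (1-10). Window size, threshold, and names
# are illustrative assumptions, not taken from any specific app or study.

from statistics import mean

def flag_relapse_risk(mood_scores, window=7, drop_threshold=2.0):
    """Return True if average mood over the most recent `window` days
    has dropped by more than `drop_threshold` points compared with the
    preceding baseline period."""
    if len(mood_scores) < 2 * window:
        return False  # not enough history to compare
    recent = mean(mood_scores[-window:])
    baseline = mean(mood_scores[-2 * window:-window])
    return (baseline - recent) > drop_threshold

# Example: a fortnight of check-ins where the second week trends down.
history = [7, 8, 7, 6, 7, 8, 7, 5, 4, 4, 3, 4, 3, 4]
if flag_relapse_risk(history):
    print("Mood trend flagged - notify the treating therapist for review.")
```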
Whether AI in therapy is here to stay or merely a passing phase is not yet clear. What is clear, for now, is that while AI can mimic parts of a conversation, it cannot replace the most important elements of therapy: trust, connection, and the human bond that helps people heal. In therapy, even small things like a pause or a deep breath can mean a lot, and technology still struggles to truly understand those moments.