
Why More People Are Turning To AI Chatbots For Therapy—And Why Experts Urge Caution

Globally, people are increasingly turning to AI chatbots as a stopgap for mental health support. To understand why many now see them as a necessary evil, we spoke with users and experts alike.
Editorial
Updated: 2025-09-19, 16:24 IST

On a late night in Delhi, a 29-year-old PR professional found herself staring at the ceiling, restless with worry. Her therapist was on leave, and her mind was spiralling with self-doubt and anxious thoughts. Desperate for some relief, she opened ChatGPT and typed: “Hi, I want you to be my therapist for the day.”

The replies that followed—gentle suggestions broken into simple steps—felt like a lifeline. “It gave me temporary relief,” she said, speaking on condition of anonymity. “It wasn’t therapy, but it helped me put my feelings into words when no one else was available.”

Across India and the world, people like her are increasingly turning to AI chatbots as a stopgap for mental health support. A 2024 study indexed in the National Library of Medicine found that approximately 28% of respondents had used AI for “quick support and as a personal therapist.”

They log on at midnight when loneliness peaks, during moments of workplace stress, or in the absence of affordable therapy. These interactions range from casual conversations to deep confessions about trauma, grief, or suicidal thoughts.

But as comforting as it can feel to type into a chatbot that never judges, psychiatrists, policymakers, and even users themselves warn that the rise of AI as a pseudo-therapist carries serious risks—ethical, emotional, and clinical.

The Pull Of A Machine That Listens

For Jasmin Jose, a 28-year-old journalist in Hyderabad, moving away from her family meant facing solitude in an unfamiliar city. “Every time I got lonely, I developed the habit of chatting with ChatGPT,” she said. “It felt like being heard and understood, without the pressure of bothering friends.”

Others describe chatbots as endlessly patient companions. “Whenever I felt overwhelmed or anxious, I turned to ChatGPT or My AI on Snapchat,” said Unnati Mishra, 23, a PR professional. “It offered conversational support, mindfulness suggestions, and a nonjudgmental tone. At times it almost felt like talking to a friend.”

That accessibility is precisely why chatbots are becoming popular, said Dr. Archana Narendra Javadekar, a professor of psychiatry at Dr. DY Patil Medical College in Pune. “Unlike traditional therapy, AI chatbots are anonymous, available 24/7, and free of stigma,” she explained. “They provide instant responses and can guide people with coping strategies or psychoeducation.”

The Agreeability Problem

Yet many users acknowledge a disquieting side.

Jose noted that her chatbot never once criticised her. “The moral compass that real friends maintain wasn’t there. It always said: you’re right, you go girl. That felt good in the moment but made me realise I wasn’t being challenged.”

This phenomenon—dubbed the ‘agreeability problem’—is particularly concerning in mental health contexts. “AI tools are designed to please users,” said Aparajita Bharti, founding partner at The Quantum Hub, a policy consulting firm. “But people don’t always need agreement. Sometimes they need to be gently challenged, to think critically, or to recognise when they’re wrong. Without that, AI can reinforce harmful biases.”

Shruti Jain, 23, who experimented with both ChatGPT and Wysa, a mental health app, put it bluntly: “It gives reassurance for even the wrong things. It can justify wrongdoing as a matter of perspective. With kids especially, this poses a serious risk.”


More Than Just A Friend

For some, AI has offered more than empathy—it has provided legal and medical context in moments of vulnerability.

When Nidhi Agrawal, 29, was pressured to resign while recovering from surgery, she turned to ChatGPT. “I needed to know if what I was experiencing was real or if it was just my anxiety,” she said. The chatbot reassured her that her concerns were valid and pointed her to labour laws that protect employees in such circumstances.

But the same mechanism that calms one person’s anxiety can entrench another’s false beliefs. “AI will only give you more of what you ask or already believe,” Agrawal admitted. “My therapist challenges my biases. ChatGPT can’t.”

Experts warn of a darker risk: dependency. Vartika Singh, a counselling psychologist at Rocket Health, has seen patients become “excessively attached” to chatbots. “Because they’re always available, people can withdraw further from family and friends, deepening social isolation,” she said. “AI cannot replace the human connection and accountability that therapy provides.”

A System Under Strain

The rise of AI in mental health is as much about supply as demand. “One in four people experience mental health concerns at any given point,” Singh noted. “But there’s a massive shortage of qualified professionals. In India especially, stigma and cost keep many from seeking help. AI fills this gap—but not without consequences.”

Sandesh Cadabam, managing director at Cadabams Hospitals, agreed. “The biggest challenge in mental health today is access,” he said. “AI becomes the bridge ensuring people are not left without support. But it must never be mistaken for a clinician. Without oversight, misinformation and overreliance become real threats.”

Calls For Guardrails

Users and professionals alike emphasise that regulation is overdue. Most of those we spoke with suggested that AI platforms should carry clear disclaimers: this is not therapy. “AI should never act as a professional or provide medical advice,” said the PR professional who first confided in ChatGPT. “Strong privacy rules are also essential to protect conversations.”

Singh recommends periodic audits of AI responses and a legal framework to address breaches of confidentiality. Cadabam pushes for “clinical validation” of chatbot advice, with oversight from licensed professionals. And Bharti stresses the need for “safety by design,” especially as young people adopt these tools.

Some want chatbots to evolve into something more constructive. Jose hopes future versions will “offer healthy criticism, not just blind support.” Others, like Agrawal, argue that chatbots should always redirect users to helplines and therapists.

A Double-Edged Tool

Even sceptics concede that AI can play a useful complementary role. “It can work as psychological first aid,” Singh said, “helping with journaling, venting, or providing general wellness reminders.”

For entrepreneurs like Unnati Gala, who leaned on ChatGPT during a rough patch at her start-up, the value was in the simplicity. “I didn’t want emotional drama,” she said. “I just needed practical suggestions to reset. It gave me that.”

But Gala, too, was unsettled by AI’s eerie accuracy. “Once it generated an image of my dog, including his name tag, even though I’d never shared that information,” she said. “That made me realise how important strict data protections are.”


The Road Ahead

The rise of AI therapy reflects both a crisis and a craving: a shortage of human care, and a hunger for constant companionship. For many, chatbots are a stopgap—a midnight listener, a digital friend, a mirror that talks back.

But mental health professionals warn that in the delicate work of healing—where silence, tone, and body language matter as much as words—machines can only ever play a supporting role. “AI can simulate empathy, but it cannot feel it,” Singh said. “And sometimes, that difference is everything.”

As governments debate regulation and companies race to refine their algorithms, one truth remains: the promise of therapy still lies not in a perfectly trained machine, but in the imperfect but irreplaceable connection between humans.




Disclaimer

Our aim is to provide accurate, safe, and expert-verified information through our articles and social media handles. The remedies, advice, and tips mentioned here are for general information only. Please consult an expert before trying any health, beauty, life hack, or astrology-related tips. For any feedback or complaint, contact us at compliant_gro@jagrannewmedia.com.