Chat with Caution? Youth Are Telling AI Their Darkest Secrets
- MediaFx

- Aug 4, 2025
- 2 min read
TL;DR: AI therapy apps are trending, but are your secrets really safe? As Indian youth pour their hearts into bots, questions loom over data leaks, ethics, and exploitation.

They’re lonely, anxious, or heartbroken, and instead of a friend, they text an AI.
From Wysa to Replika, India’s Gen Z is quietly turning to chatbots for mental health help. These AI “therapists” promise zero judgment, instant replies, and 24x7 availability. Sounds ideal, right? But here’s the scary twist: your private pain might not be so private after all.
What’s really happening?
A Scroll.in investigation (3 Aug 2025) found that mental health bots in India often operate in a legal grey zone. Apps like Wysa, Woebot, and MindDoc say they care about your privacy, but most have vague, jargon-heavy policies that few users read.
And unlike your real-life therapist, these bots aren’t bound by India’s Mental Healthcare Act.
“I told it about my suicidal thoughts. It replied with a breathing exercise,” said a 20-year-old Hyderabad college student. “Later, I got targeted ads for therapy plans I couldn’t afford.”
Flashback: Why did AI therapy boom?
Post-COVID anxiety, long waitlists for real therapists, and rising awareness about mental health meant youth needed a quick fix. AI chatbots filled that gap, offering feel-good convos at the tap of a button.
And the numbers speak for themselves: Wysa alone claims over 6 million users globally, many from India. Replika and Youper have seen downloads double in urban India since 2023.
Who gains, who loses?
Big win for startups and investors: mental health apps are now a $4 billion global industry (Forbes, 2024). But India’s youth, especially those from low-income or small-town backgrounds, may be risking their mental privacy without knowing it.
Why? Because these apps often store chats, analyze emotion patterns, and may share that data with partners “for improvement.” That can mean marketing agencies, insurers, or worse, depending on the fine print.
So what for the rest of us?
Your data = product. College bro venting about anxiety? Auto-driver’s daughter confessing loneliness? All that vulnerable content could train AI models or be monetized.
There’s no clear Indian law yet to govern AI therapy bots. And no certification to prove which chatbot is actually safe, ethical, or effective.
If AI messes up, who’s accountable? Can bots even understand regional trauma, caste stress, or gender bias?
“Apps may miss red flags. A human therapist would never,” says Mumbai psychologist Dr. S Kiran.
MediaFx Take:
India’s youth need access to affordable, real, non-judgy therapy. AI can help, but not without regulation. Until the govt steps up with digital mental health rules, be cautious about what you confess to a bot.
Use these apps for mood tracking or habit tips. But for suicidal thoughts, abuse trauma, or severe anxiety, talk to a trained, human therapist. Or dial a helpline.
In the rush to feel better, don’t let AI become your emotional landlord.



