🧠 Chat with Caution? Youth Are Telling AI Their Darkest Secrets 😶‍🌫️🤐
- MediaFx

- Aug 4
- 2 min read
TL;DR: AI therapy apps are trending 🧘‍♀️, but are your secrets really safe? 🤐 As Indian youth pour their hearts into bots, questions loom over data leaks, ethics, and exploitation.

They’re lonely, anxious, or heartbroken, and instead of a friend, they text an AI.
From Wysa to Replika, India’s Gen Z is quietly turning to chatbots for mental health help. These AI “therapists” promise zero judgment, instant replies, and 24x7 availability. Sounds ideal, right? But here’s the scary twist: your private pain might not be so private after all.
😲 What’s really happening?
A Scroll.in investigation (3 Aug 2025) found that mental health bots in India often operate in a legal grey zone. Apps like Wysa, Woebot, and MindDoc say they care about your privacy, but most have vague, jargon-heavy policies that few users read.
And unlike your real-life therapist, these bots aren’t bound by India’s Mental Healthcare Act.
“I told it about my suicidal thoughts. It replied with a breathing exercise,” said a 20-year-old Hyderabad college student. “Later, I got targeted ads for therapy plans I couldn’t afford.” 😲
📰 Flashback: Why did AI therapy boom?
Post-COVID anxiety 😷, long waitlists for real therapists, and rising awareness about mental health meant youth needed a quick fix. AI chatbots filled that gap, offering feel-good convos at the tap of a button.
And the numbers speak: 📈 Wysa alone claims over 6 million users globally, many of them in India. Replika and Youper have seen downloads double in urban India since 2023.
⚖️ Who gains, who loses?
Big win for startups and investors: mental health apps are now a $4 billion global industry (Forbes, 2024). But India’s youth, especially those from low-income or small-town backgrounds, may be risking their mental privacy without knowing it.
Why? Because these apps often store chats, analyze emotion patterns, and may share them with partners “for improvement.” That can mean marketing agencies, insurers, or worse (depending on the fine print).
🧑‍🌾 So what for the rest of us?
Your data = product. College bro venting about anxiety? Auto-driver’s daughter confessing loneliness? All that vulnerable content could train AI models or be monetized.
There’s no clear Indian law yet to govern AI therapy bots. And no certification to prove which chatbot is actually safe, ethical, or effective.
If AI messes up, who’s accountable? 🤷🏽‍♀️ Can bots even understand regional trauma, caste stress, or gender bias? 🏳️‍🌈
“Apps may miss red flags. A human therapist would never,” says Mumbai psychologist Dr. S Kiran.
🧵 MediaFx Take:
India’s youth need access to affordable, real, non-judgy therapy 🧘‍♀️💬. AI can help, but not without regulation. Until the govt steps up with digital mental health rules, be cautious about what you confess to a bot.
Use these apps for mood tracking or habit tips. But for suicidal thoughts, abuse trauma, or severe anxiety, talk to a trained, human therapist. Or dial a helpline.
In the rush to feel better, don’t let AI become your emotional landlord.