
🧠 Chat with Caution? Youth Are Telling AI Their Darkest Secrets šŸ˜¶ā€šŸŒ«ļøšŸ¤–

TL;DR: AI therapy apps are trending šŸ§˜ā€ā™€ļø, but are your secrets really safe? 🤐 As Indian youth pour their hearts out to bots, questions loom over data leaks, ethics, and exploitation.


They’re lonely, anxious, or heartbroken, and instead of a friend, they text an AI.

From Wysa to Replika, India’s Gen Z is quietly turning to chatbots for mental health help. These AI ā€œtherapistsā€ promise zero judgment, instant replies, and 24x7 availability. Sounds ideal, right? But here's the scary twist: your private pain might not be so private after all.

šŸ“² What’s really happening?

A Scroll.in investigation (3 Aug 2025) found that mental health bots in India often operate in a legal grey zone. Apps like Wysa, Woebot, and MindDoc say they care about your privacy, but most have vague, jargon-heavy policies that few users read.

And unlike your real-life therapist, these bots aren't bound by India’s Mental Healthcare Act.

ā€œI told it about my suicidal thoughts. It replied with a breathing exercise,ā€ said a 20-year-old Hyderabad college student. ā€œLater, I got targeted ads for therapy plans I couldn’t afford.ā€ šŸ“²

šŸ•° Flashback: Why did AI therapy boom?

Post-COVID anxiety 😷, long waitlists for real therapists, and rising awareness about mental health meant youth needed a quick fix. AI chatbots filled that gap, offering feel-good convos at the tap of a button.

And the numbers speak: šŸ“ˆ Wysa alone claims over 6 million users globally, many of them in India. Replika and Youper have seen downloads double in urban India since 2023.

āš–ļø Who gains, who loses?

It's a big win for startups and investors: mental health apps are now a $4 billion global industry (Forbes, 2024). But India’s youth, especially those from low-income or small-town backgrounds, may be risking their mental privacy without knowing it.

Why? Because these apps often store chats, analyze emotion patterns, and may share them with partners ā€œfor improvement.ā€ Depending on the fine print, those partners can mean marketing agencies, insurers, or worse.

šŸ§‘ā€šŸŒ¾ So what for the rest of us?

Your data = product. A college bro venting about anxiety? An auto-driver's daughter confessing loneliness? All that vulnerable content could train AI models or be monetized.

There’s no clear Indian law yet to govern AI therapy bots. And there's no certification to prove which chatbot is actually safe, ethical, or effective.

If the AI messes up, who's accountable? šŸ¤·šŸ½ā€ā™‚ļø Can bots even understand regional trauma, caste stress, or gender bias? šŸ³ļøā€šŸŒˆ

ā€œApps may miss red flags. A human therapist would never,ā€ says Mumbai psychologist Dr. S Kiran.

🧵 MediaFx Take:

India’s youth need access to affordable, real, non-judgy therapy. šŸ§‘ā€āš•ļøšŸ’¬ AI can help, but not without regulation. Until the govt steps up with digital mental health rules, be careful about what you confess to a bot.

Use these apps for mood tracking or habit tips. But for suicidal thoughts, abuse trauma, or severe anxiety—talk to a trained, human therapist. Or dial a helpline.

In the rush to feel better, don’t let AI become your emotional landlord.
