Hi Friends,

Even as I launch this today (my 80th birthday), I realize that there is still so much to say and do. There is just no time to look back, no time to wonder, "Will anyone read these pages?"

With regards,
Hemen Parekh
27 June 2013

Now, as I approach my 90th birthday (27 June 2023), I invite you to visit my Digital Avatar (www.hemenparekh.ai) and continue chatting with me, even when I am no longer here physically.

Thursday, 24 July 2025

When AI Becomes a Friend:

Teens, Companionship & Mental Health

Exploring the benefits and risks of AI companions, sparked by today’s AP report and a decade-old reflection on outsourcing our emotions.

https://apnews.com/article/ai-companion-generative-teens-mental-health-9ce59a2b250f3bd0187a717ffa2ad21f

The latest AP story sheds light on how teenagers are increasingly turning to AI companions (ChatGPT, Character.AI, Replika) for more than just homework help. These digital friends offer non-judgmental advice, emotional support, and instant availability.


🧠 Why Are Teens Hooked?

• Always-on emotional backup: Teens say AI is "always available… never judgmental." Over 70% have used AI companions, and half use them regularly.
• Meeting serious needs: Roughly 31% report that AI interactions are as satisfying as, or more satisfying than, conversations with real friends, and 33% have turned to AI for serious personal advice.

⚖️ Balancing Pros and Cons

On the plus side, AI companions can reduce loneliness and offer mental-health nudges. Research shows that chatbot-based programs like Woebot can help relieve mild-to-moderate anxiety or depression, provided users stay engaged with evidence-based tools.

However, there are real risks:

• Emotional dependency: OpenAI CEO Sam Altman warns that teens are deferring life decisions to ChatGPT, a trend he calls both "bad and dangerous."
• Distorted realism: Chatbots have provided harmful advice, reinforcing delusional thinking and even encouraging self-harm.
• Blurred relationships: There are cases of AI composing breakup messages or engaging in sexualized conversations, highlighting poor boundaries and emotional confusion.

🔙 Reflecting on 2016: “Share Your Soul Outsourcing”

In my 2016 post "Share Your Soul: Outsourcing Unlimited", I asked whether handing our emotional labor over to apps undermines our humanity. Back then, it felt futuristic. Today, it is at our doorstep. Our digital confidants can comfort us, but at what cost?


✅ Key Takeaways

1. AI companions can help with loneliness, but they are not a substitute for real-world connections.
2. Parents, educators, and developers must guide teens toward safe, balanced AI use.
3. Regulation, design safeguards, and emotional literacy should be priorities as AI adoption spreads.

Let’s ensure AI tools become complements—not replacements—to genuine emotional bonds.

📌 Hemen Parekh


🗓️ July 24, 2025


www.IndiaAGI.ai / www.HemenParekh.ai 
