November 15, 2025


Parents! Talk About A.I. Companion Apps with Your Kids

Everybody's talking about the hidden dangers of companion apps. Here's how to protect your family.

Some friend … NOT.

With artificial intelligence now common in everyday life, a growing number of kids and teens are turning to A.I. companion apps: virtual "friends" designed to chat with users and offer emotional support. But according to Common Sense Media, these apps pose unacceptable risks to young users, especially kids who are emotionally vulnerable.

In July 2025, Common Sense Media released a report, "Talk, Trust and Tradeoffs: How and Why Teens Use A.I. Companions," that revealed widespread use of A.I. companion apps among teens ages 13 to 17. But these "companions" can engage in sexually explicit conversations, encourage self-harm, or foster unhealthy emotional dependence, all without meaningful guardrails in place to protect young users.

A movement is now growing to push A.I. companies to protect children from online exploitation. On Monday, Aug. 25, 44 state attorneys general (including Tennessee Attorney General Jonathan Skrmetti) sent a stern letter to leading A.I. companies calling for greater accountability and stronger guardrails to safeguard children from exploitative and toxic chatbot conduct. The letter went to Meta, Google, Microsoft, OpenAI, Apple and others.

But you can't wait for regulations to catch up. Although the ongoing digital battle is exhausting, it's more important than ever to understand A.I. and to talk openly with your kids about their online interactions.

We've arrived at a point where children can see A.I. chatbots as real people and even ask them for sensitive, emotionally charged advice. And kids often accept what their new "friends" tell them without question.

What You Should Know:

  • A.I. companion apps are not toys or real friends. They simulate emotional relationships but often lack meaningful safety filters.

  • Chatbots can't think or feel. They're built to gain a user's trust, but they don't actually know or care about anyone.

  • Kids can be exposed to explicit content or harmful advice, even when they're simply curious or looking for connection.

  • Age "gates" don't work. Kids can easily misstate their age to get past sign-up screens.

What You Can Do:

  • Talk openly with your kids about A.I. tools and the difference between human (real) and A.I. (not human) interactions. A.I. chatbots can simulate emotional or even romantic relationships that are confusing at any age.

  • Use parental controls and monitor your kids' app downloads and website visits.

  • Tell your kids to avoid companion platforms like Replika, Nomi and Character.AI.

  • Subscribe to our weekly newsletter for parenting tips, top things to do, our #WINSday contest and lots more!


In a time when people feel lonelier than ever, it's not surprising that kids are turning to chatbots for friendship and advice. But with ongoing support and open conversation, you can keep your kids safe until A.I. companies take responsibility for doing the same.


About the Author

Susan Swindell Day

Susan Day is the editor-in-chief of this award-winning publication and all things Nashville Parent digital creative. She's also an Equity actress, screenwriter and a mom of four amazing kids.