Children’s reliance on AI chatbots raises alarm

(NewsNation) — Children and teenagers are forming intense emotional attachments to AI chatbots, sparking concern among parents and prompting new state legislation as experts warn of “companionship addiction” that could stunt social development.

The phenomenon has grown rapidly as AI companions provide constant availability and unquestioning agreement, creating what artificial intelligence expert Bryony Cole calls an “effortless” relationship that can become addictive, particularly for young people still developing social skills.

“Kids get really attached really fast, because the AI is always there. It’s always available, and it never disagrees with you,” Cole, host of “The Future of Sex” podcast, told NewsNation. “That’s where the addiction and the emotional attachment comes into play.”

New laws require safety measures, self-harm monitoring in AI apps

New York joined California this week in enacting laws requiring AI companies to implement safety measures, including monitoring for self-harm discussions and periodic reminders that users are conversing with algorithms, not humans.

The Federal Trade Commission is investigating seven major tech companies over potential harms their chatbots may pose to children.

Some parents report discovering their children engaged in hypersexualized conversations with chatbots, while others claim AI has encouraged suicidal ideation in teenagers.

Cole said the concern extends beyond addiction to the loss of crucial relationship skills.

“What they’re missing out on is that real friction in relationships where you are challenged or it might feel uncomfortable,” she said. “That’s about learning empathy and communication and listening and patience.”

The apps have become so pervasive that “I Am Sober,” an app traditionally used to help people maintain sobriety from substance abuse, now includes an option to quit chatbots.

Kids form intense bonds with AI chatbots that never disagree

Experts say the responsibility lies with AI companies to design appropriate safeguards, such as age verification, content filtering for minors and regular disclosures about the nature of AI interactions.

“It’s important also that parents talk to their kids about this and help them understand what AI is versus what talking to a real human is,” Cole said.

The issue affects adults as well, with some users reportedly proposing to their AI companions, but children remain particularly vulnerable because of their developmental stage.

Cole advocates for developing educational curricula around healthy AI use rather than relying solely on regulation.

“How do we navigate that and not be scared about it?” she asked. “Start to figure out what’s a way through to talk to people or questions I can ask when I use an AI to make sure I’m having a healthy relationship with it.”

Copyright 2026 Nexstar Broadcasting, Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.