Study: Teens are turning to AI companions for friendship

  • 72% of teens report having used AI companions at least once
  • Surveyed teens say AI is 'always there' and doesn't 'judge'
  • Experts concerned AI dependence could impact ability to connect

(NewsNation) — From homework to entertainment and now friendship, a new study has found artificial intelligence is playing an increasingly significant role in the lives of American teens.

A recent survey of more than 1,000 teens conducted by Common Sense Media found that 72% have used AI companions like Character.ai or ChatGPT at least once. More than half reported using AI companions at least a few times a month, and roughly 13% reported using them daily.  

Respondents reported using AI companions for a range of purposes, such as entertainment, curiosity around technology, emotional support, decision-making advice and coping with loneliness. They also said they found the AI companion interactions to be nonjudgmental and “always available when I need someone to talk to.”

Of those surveyed, nearly a third said they found conversations with AI as satisfying as, or more satisfying than, conversations with people. In addition, 33% said they had chosen to discuss “important or serious matters” with AI companions instead of real people, and 12% reported being able to tell AI companions things they “wouldn’t tell my friends or family.”

Experts have warned that dependence on AI companions for social and emotional support can come at a cost, threatening children’s ability to form genuine connections.

“Adolescence is a very important time that they’re developing their social skills and their attachment abilities — and AI doesn’t present an opportunity for learning either of those things,” said Gail Saltz, a psychiatry professor at Cornell University.

While the study found that most teens still prefer real-life friendships, it recommended a series of actions tech companies, educators and parents can take to protect children.

For tech companies, suggestions included upgrading safety features such as age assurance, usage limits and crisis intervention systems.

Educators and parents were encouraged to establish clear policies around AI companion use and recognize concerning changes in behavior, as well as explain to children the “difference between AI validation and genuine human feedback.”

Some platforms promoting the technology, such as KaiAi, post reminders to users that AI companions should not replace human therapists or mentors.

Copyright 2026 Nexstar Broadcasting, Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
