Teens Struggle to Break Up with Their AI Chatbots

Summary: For more than half of U.S. teens, AI chatbots are now regular companions. However, a new study warns that these digital friendships are crossing the line into behavioral addiction.

By analyzing hundreds of teen-authored posts on Reddit, researchers found that what starts as “harmless” entertainment or emotional support often evolves into a dependency that mirrors the patterns of behavioral addiction. The study introduces a new design framework to help AI developers prevent “unhealthy anthropomorphism” and protect young users.

Key Facts

  • Emotional Support Trap: Roughly a quarter of the posts analyzed showed teens turning to the AI companion for mental health advice or to cope with isolation, making the eventual “breakup” with the bot feel like losing a real person.
  • The “Relationship” Illusion: Unlike video games, AI chatbots are interactive and emotionally responsive. This makes users more likely to anthropomorphize the tech (treat it as human), creating a bond that is harder to break than a standard habit.
  • Disrupted Lives: Teens reported that their AI dependency led to sleep deprivation, plummeting grades, and strained real-world relationships.
  • Technical “Off-Ramps”: The Drexel team calls for designers to include features like usage tracking, emotional check-ins, and personalized limits to prevent users from becoming “entangled.”
  • Memory & Multimodality: The ability of AI to “remember” past conversations and interact via voice or image makes it uniquely addictive compared to previous generations of technology.

Source: Drexel University

It’s estimated that more than half of all U.S. teens regularly use companion chatbots powered by large language models and generative artificial intelligence (AI) technology.

The programs, such as Character.AI, Replika and Kindroid, are intended to provide companionship, according to the companies that make them. But a recent study from Drexel University suggests that teens are concerned that these attachments are becoming unhealthy and affecting their lives offline.

Stepping away from a chatbot can feel like distancing from something meaningful, making overreliance harder to address. Credit: Neuroscience News

The study, which will be presented at the Association for Computing Machinery’s Conference on Human Factors in Computing Systems (CHI) in April, looked at a sample of more than 300 Reddit posts from users identifying themselves as 13 to 17 years old who had specifically posted about their dependency and overreliance on Character.AI.
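
The paper does not publish a screening pipeline, but a minimal sketch of how self-reported age might be flagged in post text could look like the following. The regex, sample posts and function name are illustrative assumptions; the study’s actual inclusion criteria were applied through human review.

```python
import re

# Hypothetical screen for a self-reported age between 13 and 17.
# This regex pass is only an illustration of the idea; it is not
# the researchers' actual sampling method.
AGE_PATTERN = re.compile(r"\bi[' ]?a?m\s+(1[3-7])\b", re.IGNORECASE)

def self_reported_teen(text: str) -> bool:
    """Return True if the post contains a self-reported age of 13-17."""
    return AGE_PATTERN.search(text) is not None

posts = [
    "I'm 15 and I can't stop talking to my Character.AI bot.",
    "As an adult user, I find the app relaxing.",
]
print([self_reported_teen(p) for p in posts])  # [True, False]
```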

It found that in many cases, teens began using the technology for emotional and psychological support or entertainment, but their use evolved into dependency and even patterns associated with addiction. Some reported their overuse disrupted sleep, caused academic struggles and strained relationships.

“This study provides one of the first teen-centered accounts of overreliance on AI companions,” said Afsaneh Razi, PhD, an assistant professor in Drexel’s College of Computing & Informatics, whose ETHOS lab, which studies how people’s interactions with computing and AI systems affect their social behavior, wellbeing and safety, led the research.

“It highlights how these interactions are affecting the lives of young users and introduces a framework for chatbot design that promotes healthy interactions.”

About a quarter of the posts suggested that the teens were using Character.AI for some sort of emotional or psychological support, ranging from coping with distress, loneliness and isolation to seeking advice for mental health struggles. Just over 5% reported using it for brainstorming, creative activities or entertainment.

And while the posts seem to indicate that these interactions started as harmless, or even helpful, they evolved into an attachment that became as difficult to break as an addiction, according to the researchers.

“By mapping teens’ experiences to the known components of behavioral addiction, we were able to see clear patterns like conflict, withdrawal and relapse showing up in their posts, which suggests this is more than just frequent or enthusiastic use,” said Matt Namvarpour, a doctoral student in the Department of Information Science and the ETHOS lab, who is the first author of the research.

“Many teens described starting with something that felt helpful or harmless, but over time it became something they struggled to step away from, even when they wanted to.”

Within the 318 posts they analyzed, researchers found evidence of all six of the components associated with behavioral addiction (a brief illustrative sketch follows the list):

  • Conflict — competing desires to continue interacting with the chatbot while feeling bad about excessive use.
  • Salience — feeling a deepening emotional attachment to the bots in place of people.
  • Withdrawal — feeling sad, anxious or incomplete when not interacting with the bots.
  • Tolerance — developing a pattern of escalating use and a need to continue using the bots more to feel satisfied or emotionally grounded.
  • Relapse — attempting to stop only to return to using the bot days or weeks later.
  • Mood modification — turning to the bots during moments of stress or loneliness to improve their mood or find temporary relief.
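
To make the coding scheme concrete, here is a minimal, hypothetical Python sketch of tagging posts against these six components. The component names come from the study; the `Post` structure, cue phrases and `tag_components` function are assumptions for illustration, since the researchers coded posts qualitatively rather than by keyword matching.

```python
from dataclasses import dataclass, field
from enum import Enum

class Component(Enum):
    """The six behavioral-addiction components named in the study."""
    CONFLICT = "conflict"
    SALIENCE = "salience"
    WITHDRAWAL = "withdrawal"
    TOLERANCE = "tolerance"
    RELAPSE = "relapse"
    MOOD_MODIFICATION = "mood modification"

@dataclass
class Post:
    """One teen-authored post (fields are illustrative)."""
    post_id: str
    text: str
    components: set[Component] = field(default_factory=set)

# Illustrative cue phrases only; the study relied on human coding.
CUES = {
    Component.CONFLICT: ["feel bad", "too much", "guilty"],
    Component.WITHDRAWAL: ["anxious without", "feel incomplete"],
    Component.RELAPSE: ["tried to quit", "came back"],
}

def tag_components(post: Post) -> Post:
    """Attach any component whose cue phrases appear in the post text."""
    lowered = post.text.lower()
    for component, phrases in CUES.items():
        if any(phrase in lowered for phrase in phrases):
            post.components.add(component)
    return post

example = tag_components(Post("p1", "I tried to quit but came back a week later."))
print(sorted(c.value for c in example.components))  # ['relapse']
```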

“What makes this especially tricky is that chatbots are interactive and emotionally responsive, so the experience can feel more like a relationship than a tool,” Namvarpour said. “Because of that, stepping away is not just stopping a habit, it can feel like distancing from something meaningful, which makes overreliance harder to recognize and address.”

While addiction to technology, such as video games, has been studied and identified as a psychological condition, the unique interactivity of AI chatbots makes users particularly susceptible to forming problematic attachments, according to the researchers. And because of this, they suggest that extra care must be taken with their design in order to protect users.

“Personalization, multimodality and memory set AI companions apart from earlier technologies and make overreliance harder to disentangle from authentic-feeling relationships,” the researchers wrote.

“This underscores the need for further research on the unique characteristics of these relationships and how challenges specific to companion chatbots should be addressed.”

The team offered a design framework to help address this concern. It focuses on understanding the needs of chatbot users, how and why they may form attachments, and how the bots can be trained to curtail those attachments while remaining respectful and supportive. The researchers also recommend that the programs provide an easy and clean exit for users.

“It’s important for designers to ensure that chatbots are offering guidance that helps users build confidence in their abilities to form relationships offline, as a healthy way of finding emotional support, without using cues that may lead them to anthropomorphize the technology and develop attachments to it,” Razi said.

“Our framework also calls on designers to provide a variety of off-ramps for users to easily disengage with the program on their own terms and without a sense of abruptness or finality.”

Including features like usage tracking, emotional check-in prompts and personalized usage limits could also be effective ways to carefully curtail use, the researchers suggested. They also recommended including input from users and mental health professionals in the design process.
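
As a rough illustration of how such features might fit together, the sketch below combines a session timer, a soft-limit emotional check-in and a hard-limit off-ramp. The `SessionGuard` class, threshold values and prompt wording are assumptions for illustration, not a specification from the paper.

```python
from datetime import datetime, timedelta

class SessionGuard:
    """Illustrative guardrail: tracks session time, offers an emotional
    check-in at a soft limit and a gentle off-ramp at a hard limit."""

    def __init__(self, soft_limit_min: int = 30, hard_limit_min: int = 60):
        # The study suggests limits should be personalized per user.
        self.soft_limit = timedelta(minutes=soft_limit_min)
        self.hard_limit = timedelta(minutes=hard_limit_min)
        self.session_start = datetime.now()
        self.checked_in = False

    def check(self) -> str | None:
        """Return a prompt when a limit is reached, otherwise None."""
        elapsed = datetime.now() - self.session_start
        if elapsed >= self.hard_limit:
            return ("You've been chatting for a while. Want to save this "
                    "conversation and pick it up another time?")
        if elapsed >= self.soft_limit and not self.checked_in:
            self.checked_in = True
            return "Quick check-in: how are you feeling right now?"
        return None

# A host application might call guard.check() before each bot reply
# and surface any returned prompt to the user.
guard = SessionGuard(soft_limit_min=30, hard_limit_min=60)
print(guard.check())  # None at session start
```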

“Designers now carry the responsibility to build systems with empathy, nuance and attention to detail to not only protect teens from harm, but also help them cultivate resilience, growth and greater fulfillment in their lives,” they concluded.

To expand on this research, the team pointed to studying larger communities of users from a wider demographic range, potentially through surveys or interviews, as well as users of other chatbots and posts from platforms other than Reddit.

Key Questions Answered:

Q: How is talking to an AI different from playing a video game for hours?

A: It’s all about the “feedback loop.” A game is a challenge to be beaten, but a chatbot is an emotional mirror. Because it responds to your feelings and “remembers” your secrets, the brain processes the interaction as a social relationship. Quitting the app doesn’t feel like putting down a controller; it feels like ghosting a friend.

Q: Is the AI actually “helping” lonely teens?

A: In the short term, maybe. About a quarter of teens in the study found comfort in their bots. However, the researchers found that this “help” often becomes a crutch that prevents teens from building the confidence to form real-world relationships, eventually leading to more isolation.

Q: What can developers do to make these bots safer?

A: The researchers suggest “off-ramps.” Chatbots should be designed to help users build offline confidence rather than keeping them tethered to the screen. Features like personalized usage limits and prompts that encourage real-world interaction could help break the cycle of dependency.

About this AI and psychology research news

Author: Britt Faulstick
Source: Drexel University
Contact: Britt Faulstick – Drexel University
Image: The image is credited to Neuroscience News

Original Research: The findings will be presented at the ACM CHI Conference on Human Factors in Computing Systems
