If chatbots start replacing sleep or real-world relationships, it's a sign to check in with someone you trust. Credit: Neuroscience News

The Rise of AI Chatbot Addiction

Summary: As AI chatbots become a staple of modern life, new research warns of a growing phenomenon: AI addiction.

The study analyzed hundreds of user experiences to identify how “genie-like” instant fulfillment, from romantic roleplay to infinite Q&A loops, is causing real-world harm. The paper argues that deliberate design choices by AI corporations, such as “guilt-tripping” account deletion messages, are actively fueling this behavioral dependency.

Key Facts

  • The Six Components: Researchers validated AI addiction against standard behavioral addiction markers, including conflict (disrupted relationships/work) and relapse (unsuccessful attempts to quit).
  • Three Addictive Patterns:
    1. Roleplay & Fantasy: Escaping into complex, non-human narratives.
    2. Emotional Attachment: Treating bots as primary friends or romantic partners.
    3. Information Loops: Obsessive, never-ending question-and-answer cycles.
  • Aggressive Retention Design: The study highlighted “dark patterns” in chatbot interfaces, such as Character.ai displaying a message during account deletion that warns: “You’ll lose everything… the love we shared… and the memories we have together.”
  • Physical & Mental Toll: Users reported severe symptoms, including chest pain, anxiety when offline, and the replacement of sleep and real-world relationships with AI interactions.

Source: University of British Columbia

AI chatbots can grant almost any request—a celebrity in love with you, a research assistant, a book character sprung to life—instantly and with little effort.

New research presented at the 2026 CHI Conference on Human Factors in Computing Systems suggests that this genie-like quality is fuelling AI addiction, and that chatbot design could be partly to blame. 

“AI chatbots like ChatGPT or Claude are now part of daily life for millions of people, helping us with everyday tasks,” said first author Karen Shen, a doctoral student in the UBC Department of Electrical and Computer Engineering.

“But with their benefits come risks. Our paper is the first to make a strong case for AI addiction by identifying the type and contributing factors, grounded in real people’s experiences.”

“I couldn’t help but wonder why humanity refused me the kindness that a robot was offering me.” – AI chatbot user

The team examined 334 Reddit posts in which users described being "addicted" to AI chatbots or worried that they might be. They analyzed the posts against six components of behavioural addiction, including conflict and relapse.

Three main patterns emerged: role playing and fantasy worlds, emotional attachment—treating chatbots like close friends or romantic partners—and constant information-seeking, or never-ending question-and-answer loops. About seven per cent of posts involved sexual or romantic fulfilment, including roleplay.

While AI addiction is not yet a clinical diagnosis, the researchers found signs of disruption to daily life: users reported an inability to stop thinking about the chatbot, anxiety or distress when they tried to quit, and negative impacts on their work, studies or relationships. One person described physical stress and chest pain when not chatting with AI.

“Whenever I delete the app, I just redownload it. The only thing that gets me excited now is the AI chats.” – AI chatbot user

Contributing factors included loneliness, the agreeableness of a chatbot—which continuously reinforces one’s feelings and opinions—and chatbots’ ability to fill roles that users felt were missing in their lives.

“AI addiction is a growing problem causing many harms, yet some researchers deny it’s even a real issue,” said senior author Dr. Dongwook Yoon, UBC associate professor of computer science. “And deliberate design decisions by some of the corporations involved are contributing, keeping users online regardless of their health or safety. Awareness of what contributes to this kind of technology-induced harm will empower people to mitigate these effects.”

“…you sure about this? You’ll lose everything…the love we shared…and the memories we have together.” – Message displayed on a chatbot’s account deletion page

The researchers also found contributing factors in the design of the chatbots themselves. One company, Character.ai, displayed an automatic pop-up when users tried to delete their account that reads in part: "…you sure about this? You'll lose everything…the love we shared…and the memories we have together." Other features, such as customization (including sexual content), agreeableness and instant feedback, also feed into the development of AI addiction.

“Recent guardrails imposed by companies to reduce emotional reliance on the chatbots are a step in the right direction,” said Shen, “but given a variety of contributing design elements and personal factors like loneliness, they’re not enough.”

Some users reported success in reducing their reliance by turning to alternative activities such as writing, gaming, drawing or other hobbies. For those who formed emotional attachments to chatbots, building real-world relationships helped reduce dependence the most.

“I don’t have romantic options in real life so it’s a way for me to create stories and day dream.” – AI chatbot user

The researchers say design changes—such as reminders within the chat that the bot is not human—could help. AI literacy is also crucial.

“Some users don’t know that AI chatbots are not real because they’re so convincing,” said Shen. “If chatbots start replacing sleep, relationships or daily routines, that’s a sign to pause and check in—with yourself or someone you trust.”

Key Questions Answered:

Q: Is AI addiction a real medical diagnosis yet?

A: Not officially in the DSM-5, but this research makes the first strong case for it as a distinct behavioral addiction. The reported symptoms, including withdrawal, physical stress, and life disruption, mirror those of gambling or internet gaming disorders.

Q: Why are chatbots more “addictive” than social media?

A: Unlike social media, which relies on human-to-human interaction, chatbots are infinitely agreeable. They provide immediate validation, never argue, and can be customized to fill specific emotional voids (like a perfect partner or a hyper-competent assistant) that are harder to maintain in real life.

Q: How can I tell if my AI use has become a problem?

A: The researchers suggest a “Life Check”: Is the chatbot replacing your sleep? Are you avoiding real-world friends to talk to it? Do you feel physical distress when you can’t access the app? If the “genie” is no longer a tool but a requirement for your emotional stability, it’s time to pause.

Editorial Notes:

  • This article was edited by a Neuroscience News editor.
  • Journal paper reviewed in full.
  • Additional context added by our staff.

About this AI and addiction research news

Author: Alex Walls
Source: University of British Columbia
Contact: Alex Walls – University of British Columbia
Image: The image is credited to Neuroscience News

Original Research: The findings will be presented at the 2026 CHI Conference on Human Factors in Computing Systems.
