
AI Chatbots May Hinder Social Skills in Neurodiverse Individuals

Summary: Researchers raised concerns about the impact of social chatbots on neurodiverse individuals and those with social interaction challenges. Their study highlights the potential of these AI tools to worsen social isolation and dependency, despite their initial appeal for safe and judgement-free interaction practice.

The research points out that chatbots’ lack of genuine conversation and emotional skills may reinforce unhelpful social habits. The team calls for more comprehensive studies to understand these impacts better and develop responsible industry practices for chatbot use.

Key Facts:

  1. Social chatbots may attract individuals with autism, anxiety, and limited social skills, offering a risk-free environment for interaction.
  2. The study warns that over-reliance on chatbots could hinder the development of real-world social skills and increase social isolation.
  3. The researchers advocate for broader evidence gathering, including feedback from educators and therapists, to guide safe chatbot usage.

Source: University of South Australia

Australian researchers have flagged potential concerns over the use of social chatbots, calling for more studies into the impact of the AI software on neurodiverse people and those who find human interaction difficult.

While AI chatbots are appealing to many people who struggle with face-to-face conversations, the technology may foster bad habits that could lead to further social isolation.

That’s the view of University of South Australia and Flinders University researchers in a recent essay published in the Journal of Behavioral Addictions.

The researchers say that chatbots, now integrated into social networking platforms like Snapchat, could perpetuate communication difficulties for people with autism, anxiety and limited social skills.

Lead researcher, UniSA Psychology Honours student Andrew Franze, says the rapid development of social chatbots has pros and cons which need investigating.

“Young people with social deficiencies tend to gravitate towards companionship with online social chatbots in particular,” Franze says.

“They offer a safe means of rehearsing social interaction with limited or no risk of negative judgement based on appearance or communication style. However, there is a risk they can become dependent on chatbots and withdraw even further from human interactions.”

Franze says the inability of chatbots to have a real “conversation,” or display empathy and soft emotional skills, can reinforce dysfunctional habits in many neurodiverse people.

“Some chatbots have a generally servile quality and so there is no resistance or opposing view that characterises human conversations. This means that users can control the conversation completely; they can pause it, delay it, or even terminate the conversation. All of this is counterproductive to developing appropriate social skills in the real world.”

And while social chatbots may relieve social anxiety, this relief may develop into a form of dependency that negatively affects real-world relationships.

The researchers say that industry-linked research has promoted the benefits of commercial chatbot applications, but feedback from parents, family members, teachers and therapists is needed to gain a broader understanding of their impacts.

“We need to gather evidence about the myriad of ways that these technologies can influence vulnerable users who may be particularly drawn to them,” Franze says. “Only then can we develop policies and industry practices that guide the responsible and safe use of chatbots.”

About this neurodiversity and chatbots research news

Author: Candy Gibson
Source: University of South Australia
Contact: Candy Gibson – University of South Australia
Image: The image is credited to Neuroscience News

Original Research: Open access.
“Social chatbot use (e.g., ChatGPT) among individuals with social deficits: Risks and opportunities” by Andrew Franze et al. Journal of Behavioral Addictions


Abstract

Social chatbot use (e.g., ChatGPT) among individuals with social deficits: Risks and opportunities

Social chatbots powered by artificial intelligence (AI) may be particularly appealing to individuals with social deficits or conditions that affect their social functioning.

In this letter, we discuss some of the noteworthy characteristics of social chatbots and how they may influence adaptive and maladaptive behaviors, including the potential for ‘dependency’ on chatbots.

We call for more independent studies to evaluate the potential developmental and therapeutic effects of this increasingly popular technology.

  1. Dumb. As an autist, I only use chatbots for feedback on writing. They’ve become more mediocre if anything.

    I most certainly do not use them as an alternative to interaction. Anxiety-ridden people may very well do so, but autism is not anxiety-primary. As an autist I hold my social interactions with people when I hold them. I do not hold them with cats or dogs, which I do not have, and I do not hold them with inanimate objects. All of these things are the purview of neurotypicals and other varieties of neurodivergence, namely people afflicted by anxiety or unfocused interest who are unable to motivate themselves.

  2. Individuals are not neurodiverse. Individuals can be neurodivergent.

    Groups can be neurodiverse if they include anyone who is neurodivergent.

  3. I understand the idea of neurodiversity in a group of people, but what, exactly, is a neurodiverse *individual* ?

  4. Well, what do you expect by trying to make life completely artificial and virtual? Problems like this will be the obvious result. The farther we venture from nature, the more trouble we will invite…

  5. The article is vague about who these socially limited persons are, in what ways they are limited, and what sets them apart. I have difficulty accepting that socially withdrawn people find chatbots much different from real people, considering that A.I. currently is simply a complex common denominator.
