Summary: Researchers raised concerns about the impact of social chatbots on neurodiverse individuals and those with social interaction challenges. Their study highlights the potential of these AI tools to worsen social isolation and dependency, despite their initial appeal for safe and judgement-free interaction practice.
The research points out that chatbots’ inability to hold a genuine conversation or display emotional skills may reinforce unhelpful social habits. The team calls for more comprehensive studies to better understand these impacts and to develop responsible industry practices for chatbot use.
Key Facts:
- Social chatbots may attract individuals with autism, anxiety, and limited social skills, offering a risk-free environment for interaction.
- The study warns that over-reliance on chatbots could hinder the development of real-world social skills and increase social isolation.
- The researchers advocate for broader evidence gathering, including feedback from educators and therapists, to guide safe chatbot usage.
Source: University of South Australia
Australian researchers have flagged potential concerns over the use of social chatbots, calling for more studies into the impact of the AI software on neurodiverse people and those who find human interaction difficult.
While AI chatbots appeal to many people who struggle with face-to-face conversations, the technology may foster bad habits that lead to further social isolation.
That’s the view of University of South Australia and Flinders University researchers in a recent essay published in the Journal of Behavioral Addictions.
The researchers say that chatbots, now integrated into social networking platforms like Snapchat, could perpetuate communication difficulties for people with autism, anxiety and limited social skills.
Lead researcher, UniSA Psychology Honours student Andrew Franze, says the rapid development of social chatbots has pros and cons which need investigating.
“Young people with social deficiencies tend to gravitate towards companionship with online social chatbots in particular,” Franze says.
“They offer a safe means of rehearsing social interaction with limited or no risk of negative judgement based on appearance or communication style. However, there is a risk they can become dependent on chatbots and withdraw even further from human interactions.”
Franze says the inability of chatbots to have a real “conversation,” or display empathy and soft emotional skills, can reinforce dysfunctional habits in many neurodiverse people.
“Some chatbots have a generally servile quality and so there is no resistance or opposing view that characterises human conversations. This means that users can control the conversation completely; they can pause it, delay it, or even terminate the conversation. All of this is counterproductive to developing appropriate social skills in the real world.”
And while social chatbots may relieve social anxiety, that relief can develop into a form of dependency that harms real-world relationships.
The researchers say that industry-linked research has promoted the benefits of commercial chatbot applications, but feedback from parents, family members, teachers and therapists is needed to gain a broader understanding of their impacts.
“We need to gather evidence about the myriad of ways that these technologies can influence vulnerable users who may be particularly drawn to them,” Franze says. “Only then can we develop policies and industry practices that guide the responsible and safe use of chatbots.”
About this neurodiversity and chatbots research news
Author: Candy Gibson
Source: University of South Australia
Contact: Candy Gibson – University of South Australia
Original Research: Open access.
“Social chatbot use (e.g., ChatGPT) among individuals with social deficits: Risks and opportunities” by Andrew Franze et al. Journal of Behavioral Addictions
Abstract
Social chatbot use (e.g., ChatGPT) among individuals with social deficits: Risks and opportunities
Social chatbots powered by artificial intelligence (AI) may be particularly appealing to individuals with social deficits or conditions that affect their social functioning.
In this letter, we discuss some of the noteworthy characteristics of social chatbots and how they may influence adaptive and maladaptive behaviors, including the potential for ‘dependency’ on chatbots.
We call for more independent studies to evaluate the potential developmental and therapeutic effects of this increasingly popular technology.