Summary: Children aged 4 to 11 judge Alexa to have more human-like thoughts and emotions than other AI devices, such as the Roomba robotic vacuum, but believe neither device should be yelled at or harmed. These feelings dwindle as children approach their teenage years.
Source: Duke University
Most kids know it’s wrong to yell or hit someone, even if they don’t always keep their hands to themselves. But what about if that someone’s name is Alexa?
A new study from Duke developmental psychologists asked kids just that, as well as how smart and sensitive they thought the smart speaker Alexa was compared to its floor-dwelling cousin Roomba, an autonomous vacuum.
Four- to eleven-year-olds judged Alexa to have more human-like thoughts and emotions than Roomba. But despite the perceived difference in intelligence, kids felt neither the Roomba nor the Alexa deserved to be yelled at or harmed. That feeling dwindled as kids advanced toward adolescence, however.
The findings appear online April 10 in the journal Developmental Psychology.
The research was inspired in part by lead author Teresa Flanagan seeing how Hollywood depicts human-robot interactions in shows like HBO’s “Westworld.”
“In Westworld and the movie Ex Machina, we see how adults might interact with robots in these very cruel and horrible ways,” said Flanagan, a visiting scholar in the department of psychology & neuroscience at Duke. “But how would kids interact with them?”
To find out, Flanagan recruited 127 children aged four to eleven who were visiting a science museum with their families. The kids watched a 20-second clip of each technology, and then were asked a few questions about each device.
Working under the guidance of Tamar Kushnir, Ph.D., her graduate advisor and a Duke Institute for Brain Sciences faculty member, Flanagan analyzed the survey data and found some mostly reassuring results.
Overall, kids decided that both the Alexa and Roomba probably aren’t ticklish and wouldn’t feel pain if they got pinched, suggesting they can’t feel physical sensations like people do. However, they gave Alexa, but not the Roomba, high marks for mental and emotional capabilities, like being able to think or getting upset after someone is mean to it.
“Even without a body, young children think the Alexa has emotions and a mind,” Flanagan said. “And it’s not that they think every technology has emotions and minds — they don’t think the Roomba does — so it’s something special about the Alexa’s ability to communicate verbally.”
Regardless of the different perceived abilities of the two technologies, children across all ages agreed it was wrong to hit or yell at the machines.
“Kids don’t seem to think a Roomba has much mental abilities like thinking or feeling,” Flanagan said. “But kids still think we should treat it well. We shouldn’t hit or yell at it even if it can’t hear us yelling.”
The older kids got, however, the more acceptable they reported it would be to attack the technology.
“Four- and five-year-olds seem to think you don’t have the freedom to make a moral violation, like attacking someone,” Flanagan said. “But as they get older, they seem to think it’s not great, but you do have the freedom to do it.”
The study’s findings offer insights into the evolving relationship between children and technology and raise important questions about the ethical treatment of AI and machines in general. They also raise questions for parents: should adults, for example, model good behavior for their kids by thanking Siri or its more sophisticated counterpart ChatGPT for their help?
For now, Flanagan and Kushnir are trying to understand why children think it is wrong to assault home technology.
In their study, one 10-year-old said it was not okay to yell at the technology because, “the microphone sensors might break if you yell too loudly,” whereas another 10-year-old said it was not okay because “the robot will actually feel really sad.”
“It’s interesting with these technologies because there’s another aspect: it’s a piece of property,” Flanagan said. “Do kids think you shouldn’t hit these things because it’s morally wrong, or because it’s somebody’s property and it might break?”
Funding: This research was supported by the U.S. National Science Foundation (SL-1955280, BCS-1823658).
About this AI and emotion research news
Author: Karl Bates
Source: Duke University
Contact: Karl Bates – Duke University
Image: The image is credited to Veronique Koch, Duke University
Original Research: Closed access.
“The Minds of Machines: Children’s Beliefs About the Experiences, Thoughts, and Morals of Familiar Interactive Technologies” by Teresa Flanagan. Developmental Psychology
The Minds of Machines: Children’s Beliefs About the Experiences, Thoughts, and Morals of Familiar Interactive Technologies
Children are developing alongside interactive technologies that can move, talk, and act like agents, but it is unclear if children’s beliefs about the agency of these household technologies are similar to their beliefs about advanced, humanoid robots used in lab research.
This study investigated 4–11-year-old children’s (N = 127, Mage = 7.50, SDage = 2.27, 53% female, 75% White; from the Northeastern United States) beliefs about the mental, physical, emotional, and moral features of two familiar technologies (Amazon Alexa and Roomba) in comparison to their beliefs about a humanoid robot (Nao).
Children’s beliefs about the agency of these technologies were organized into three distinct clusters—having experiences, having minds, and deserving moral treatment. Children endorsed some agent-like features for each technology type, but the extent to which they did so declined with age.
Furthermore, children’s judgment of the technologies’ freedom to “act otherwise” in moral scenarios changed with age, suggesting a developmental shift in children’s understanding of technologies’ limitations. Importantly, there were systematic differences between Alexa, Roomba, and Nao that corresponded to the unique characteristics of each.
Together these findings suggest that children’s intuitive theories of agency are informed by an increasingly technological world.