DURHAM, N.C. — Kids can be cruel, but they can also be quite kind — even to machines. Researchers from Duke University report that children judged Amazon’s Alexa as smarter and more human-like than a robot vacuum like Roomba. Moreover, kids believed that people should not yell at or harm either device. Interestingly, that feeling tended to fade as children got older.
Study authors asked a group of four- to 11-year-olds how smart and sensitive they thought the smart speaker Alexa was in comparison to its floor-dwelling robotic cousin, Roomba. Lead author Teresa Flanagan says this research was partially inspired by Hollywood’s depictions of human-robot interactions in shows like HBO’s “Westworld.”
“In Westworld and the movie Ex Machina, we see how adults might interact with robots in these very cruel and horrible ways,” says Flanagan, a visiting scholar in the department of psychology & neuroscience at Duke, in a university release. “But how would kids interact with them?”
So, to find out, Flanagan recruited 127 children (ages 4-11) who had been visiting a science museum with their families. Participating children watched a 20-second clip of each technology, and then answered a few questions about each robotic device. Next, with help from Tamar Kushnir, Ph.D., her graduate advisor and a Duke Institute for Brain Sciences faculty member, Flanagan analyzed the collected data, uncovering mostly reassuring results.
Overall, kids said that both Alexa and the Roomba probably aren’t ticklish and wouldn’t feel pain if someone pinched them, suggesting neither can feel physical sensations like people do. Alexa, however, received higher marks than the Roomba for mental and emotional capabilities, such as being able to think or getting upset after someone is mean to it.
“Even without a body, young children think the Alexa has emotions and a mind,” Flanagan explains. “And it’s not that they think every technology has emotions and minds — they don’t think the Roomba does — so it’s something special about the Alexa’s ability to communicate verbally.”
Still, regardless of the two devices’ perceived abilities, children across all examined ages agreed it would be wrong to hit or yell at the machines.
“Kids don’t seem to think a Roomba has much mental abilities like thinking or feeling,” Flanagan notes. “But kids still think we should treat it well. We shouldn’t hit or yell at it even if it can’t hear us yelling.”
Notably, though, the older the kids were, the more acceptable they found it to attack technology.
“Four- and five-year-olds seem to think you don’t have the freedom to make a moral violation, like attacking someone,” Flanagan continues. “But as they get older, they seem to think it’s not great, but you do have the freedom to do it.”
These findings offer important insights into the ever-evolving relationship between children and technology, but also raise a number of valid questions concerning the ethical treatment of AI and machines in general. Moreover, how should parents navigate all of this? Should adults, for example, model good behavior for their kids by thanking Siri or its more sophisticated counterpart ChatGPT for its help?
Moving forward, Flanagan and Kushnir are working to understand why children think it is wrong to assault home technology.
During the study, one 10-year-old said it was not okay to yell at the technology because, “the microphone sensors might break if you yell too loudly.” Another 10-year-old said it was not okay because “the robot will actually feel really sad.”
“It’s interesting with these technologies because there’s another aspect: it’s a piece of property,” Flanagan concludes. “Do kids think you shouldn’t hit these things because it’s morally wrong, or because it’s somebody’s property and it might break?”
The study is published in the journal Developmental Psychology.