Humans are so social that we try to fit in with robots

Teaching robots simple social skills could make their interactions with humans go more smoothly.
Can robots and humans keep beat with one another? Brett Davis / Flickr

Humans, like most social animals, mirror each other’s mannerisms and facial expressions through an act psychologists call mimicry. More often than not, mimicry helps humans feel more positive about the person they’re with. 

“When humans are interacting, they adopt the other’s rhythms in terms of breathing, speaking, and moving—and it’s unconscious,” says Ghiles Mostafaoui, a researcher from the Université de Cergy-Pontoise in France. It is, he says, “kind of the social glue.” 

In some cases, mimicry can facilitate bonding across species, like between humans and monkeys. But can this imitation be observed between a human and a robot? “If we had this type of interaction with machines, or robots, or computers, you can obtain better, intuitive interactions,” Mostafaoui says.

He and his colleagues found through a small study published this week in PLOS ONE that humans do indeed mirror the movement of humanoid robots that they’re interacting with, and the rhythmic coordination between human and robot looks similar to that between human and human. 

“We had one or two publications with the proof-of-concept of it,” Mostafaoui says. “This one is the first one proving that humans can unintentionally coordinate with a robot, if the robot is moving in a way similar to humans.” 

Bop as the bot bops

The setup was simple. Fifteen human subjects each sat opposite a humanoid robot called NAO, which was perched on a table facing them. Each subject was instructed to hold one arm straight out in front of them and move it up and down. The robot did the same. Its movement was controlled by an external computer running an algorithm that let the robot either sync its arm movements with the human's or move at a fixed pace, like a metronome. The subjects were not told beforehand what the robot would do: they didn't know whether it was moving on its own or reacting to their movements.
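To make those two modes concrete, here is a minimal illustrative sketch in Python. The study's actual controller was a neural oscillator model whose details aren't given here, so this stand-in uses a simple Kuramoto-style phase coupling instead; the function names, loop rate, frequency, and coupling gain are all assumptions for illustration, not the researchers' code.

```python
import math

# Illustrative sketch (not the study's controller): the two modes the external
# computer could switch between. In "metronome" mode the robot's arm follows a
# fixed-frequency oscillation; in "sync" mode its phase is nudged toward the
# human's observed phase, Kuramoto-style. All constants below are assumed.

DT = 0.02        # control-loop time step in seconds (assumed)
FIXED_HZ = 0.5   # base arm-swing frequency, like a metronome (assumed)
COUPLING = 1.5   # how strongly the robot adapts to the human (assumed)

def step_robot_phase(robot_phase, human_phase, mode):
    """Advance the robot's movement phase by one control tick.

    human_phase would come from tracking the subject's arm (e.g., a camera).
    """
    d_phase = 2 * math.pi * FIXED_HZ * DT  # steady metronome-like rhythm
    if mode == "sync":
        # Pull the robot's phase toward the human's (one-way coupling).
        d_phase += COUPLING * math.sin(human_phase - robot_phase) * DT
    return (robot_phase + d_phase) % (2 * math.pi)

def arm_angle(phase, amplitude_deg=30.0):
    """Map the current phase to an up-and-down arm angle command."""
    return amplitude_deg * math.sin(phase)
```

In "sync" mode, the sine term slows the robot down or speeds it up until the phase difference shrinks, so the two rhythms lock; in "metronome" mode that term is absent and the robot ignores the human entirely.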

Researchers first asked each subject to move their arm however they wanted, regardless of what the robot was doing. Then, they asked the subject to try to match the robot's rhythm. There was also a control condition in which subjects moved their arm freely while wearing a blindfold and headphones.

[Related: MIT scientists taught robots how to sabotage each other]

They observed that almost all subjects eventually matched up with the robot’s beat. The outlier was a dancer who always moved in a syncopated, or off-beat, manner with the robot. “We never succeeded in synchronizing her with the robot, even when the robot was controlled with the neural model I developed to synchronize it with the humans,” says Mostafaoui. “We think that she was more intentionally avoiding the robot’s rhythm.”

Most participants noticed that they were synchronizing with the robot, but they were unsure whether it was they or the robot doing the adjusting. Perhaps finding a way to measure intentionality through EEG or fMRI could be a next step in mapping this behavioral pattern onto neurobiology, Mostafaoui proposes.

There could be practical applications for this research. Mostafaoui was part of a team that used the NAO robot to help patients with schizophrenia rehabilitate their movement coordination. Many of these patients aren't physically active and can have social deficits that make it hard for them to coordinate with others, Mostafaoui explains, further degrading their ability to carry out normal social interactions.

Recently, he’s been asked to study the robot’s effect on patients with catatonia, an affliction that causes people to freeze their movements uncontrollably. But what’s peculiar about this condition, Mostafaoui notes, is that if you move in front of them, they will unconsciously imitate you. He thinks that a bot could possibly help here.

Should engineers teach robots simple social skills?

Unintentional coordination is important in human social settings, perhaps because it's interlinked with attention and learning. And if it's important for human-human interaction, it's likely just as important for human-machine interaction.

[Related: Do we trust robots enough to put them in charge?]

So, what does this mean for designing future robots? Engineers have started moving away from hard, mechanical robots toward softer ones that borrow features of human or animal physiology, which could make their motions and gestures look more natural. But for robots' presence in our world to feel natural, Mostafaoui thinks these systems also need to react to us. That means building robots and algorithms with realistic motor and sensory reflexes, and they don't have to be especially complex to be effective. In this experiment, it was not so important that the robot looked human; it was more important that it moved like one.

“If I try to predict everything you do, our interaction will not be very natural,” he says. “We don’t need to control, or predict the position of each articulation.” What makes two-way interactions feel fluid are simple gestures, like nodding your head when another person does to show that you’re listening.