
Two years ago, a yellow spongiform robot named Keepon became a minor YouTube sensation when one of its creators programmed it to do a squishy, twisty dance in time to the Spoon song “I Turn My Camera On.” The video has garnered more than 2 million hits. Now Keepon’s keepers, Marek Michalowski, a Ph.D. student in robotics at Carnegie Mellon University, and Hideki Kozima of Miyagi University in Japan, are turning Keepon’s attention to a more serious task: to study how children with autism spectrum disorders (ASD) interact socially and to see if the robot may be able to help in therapy.

Keepon is just one of the many new robots that researchers are using to study and to help children with ASD. The robots do everything from studying the children’s social interactions and their emotional states to drawing them out socially.

Children with ASD often have trouble with the “dance” of body language and facial gestures needed to have successful conversations and social contact with others. Both reading the intentions of others and knowing their own emotions can be a struggle, and children often become stranded both emotionally and socially.

One baby in every 150 born today in the U.S. is diagnosed with ASD. Treatment involves a combination of therapies — behavioral, educational, physical, occupational, and speech — that is costly and not always effective. After finding that children with ASD interact more easily with robots than with people, researchers began developing expressive and interactive robots that can assist them in studying and creating effective therapy for the children.

Keepon’s gentle boogieing and its simple, innocuous appearance (five inches tall, rubbery, resembling two tennis balls stacked one on top of the other) make it perfect for interacting with socially withdrawn children. Armless and legless, with only two eyes and a nose, Keepon expresses itself mainly through its four movements: nodding, turning, rocking and bobbing.

However, Keepon does have a camera behind those eyes and a microphone hidden in its nose. Researchers Michalowski and Kozima have studied preschool children with ASD in Japan and have found that interacting with the robots draws the children into a range of new social behaviors. Videos of those encounters show the children feeding Keepon imaginary food, giving it imaginary medicine when it has a Band-Aid on its head, and protecting it against abuse by other children.

The most striking video shows one girl slowly forging a relationship with the robot. At first she refuses even to look directly at Keepon, but as the days go on, she draws closer to the robot, eventually touching it with a xylophone stick, then her hand. After weeks, she can be seen looking into Keepon’s eyes, putting a hat on it, and even giving it a kiss, an action she rarely performed even towards her own mother.

Researchers Maja Mataric and Ph.D. student David Feil-Seifer of the University of Southern California’s Viterbi School of Engineering recently used a colorful roving and bubble-blowing robot to test the power of robots for research into ASD. The robot had two settings: one in which it blew bubbles and rolled around on its own, and the other in which it performed those actions only when the child pushed a button on it.

They found that when the children were able to control the robot’s behavior, their sociability increased — they interacted and spoke more not only to the robot but to the researchers and their parents as well.
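The difference between the robot’s two settings can be sketched as a simple decision rule. This is a hypothetical illustration only: the mode names, the `button_pressed` input, and the action labels are assumptions for the sketch, not the researchers’ actual software.

```python
import random

def robot_step(mode, button_pressed):
    """Decide the robot's action for one time step.

    mode: "autonomous" - the robot acts on its own schedule;
          "contingent" - the robot acts only when the child presses its button.
    Returns the action taken: "blow_bubbles", "roll", or "idle".
    """
    if mode == "contingent":
        # The child is in control: the robot responds only to the button.
        if button_pressed:
            return random.choice(["blow_bubbles", "roll"])
        return "idle"
    # Autonomous mode: the robot acts regardless of the child's input.
    return random.choice(["blow_bubbles", "roll"])
```

In contingent mode the child’s action reliably produces a response, which is the sense in which the children “controlled the robot’s behavior” in the study.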

Mataric and Feil-Seifer are also working with Dr. Marian Williams from Children’s Hospital in Los Angeles to replace the toys in the hospital’s therapist-led playtime with robots that blow bubbles, toot horns and smile, all of which engage the children.

Children with autism are often poor at giving outward emotional cues of their inner states, and robots are being developed to help with this as well.

Nilanjan Sarkar, associate professor of mechanical engineering at Vanderbilt University, has developed a method that uses several physiological measurements — heart rate, galvanic skin response, temperature and muscle response, among others — to monitor a subject’s emotional state. He has teamed up with a leading autism researcher, Vanderbilt Professor of Pediatrics Wendy Stone, to use the method to study teenage children with ASD.

They believe if they can detect when the child is becoming upset or anxious, then they can help the child identify his or her own emotional state and figure out ways to intervene.

In the experiment, they attached the physiological sensors to the participants as they played two games: the computer game Pong and a modified version of Nerf basketball, with the hoop and backboard attached to the end of a robotic arm that moved them back and forth or up and down.

Using the data from the sensors, the researchers modeled the emotional landscape of each child and could even predict his or her emotional states of liking, anxiety and engagement. This data was also used in real time to alter the game in ways that increased the children’s degree of engagement.
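The real-time loop described above — estimate the child’s state from sensor data, then adjust the game — can be sketched as follows. Everything here is illustrative: the feature names, weights, thresholds, and the single “difficulty” parameter are assumptions, not details from the Vanderbilt study, which fit its models per child from training data.

```python
def engagement_score(features, weights):
    """Combine normalized physiological features (each scaled 0..1)
    into a single engagement estimate. The real model would be
    trained separately for each child."""
    return sum(weights[name] * value for name, value in features.items())

def adapt_game(difficulty, engagement, anxiety):
    """One update of a hypothetical adaptation rule: keep the child
    engaged without letting anxiety climb."""
    if anxiety > 0.7:
        return max(0.0, difficulty - 0.1)   # back off when the child is anxious
    if engagement < 0.4:
        return min(1.0, difficulty + 0.1)   # add challenge when interest flags
    return difficulty                        # otherwise leave the game alone
```

The key idea is that the sensor-derived estimates close the loop: the game changes in response to the child’s measured state rather than on a fixed schedule.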

The data could also be used to identify the triggers that cause the child to withdraw — for some children it is eye contact, for others loud noises or close proximity to people. Once a particular trigger is identified, a robot could be programmed either to increase the stimulus gradually to build up tolerance, or to back off entirely when it senses that the child is overwhelmed.
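That graded-exposure idea reduces to a small update rule. This is a minimal sketch under assumed conventions: the stimulus intensity is a hypothetical 0-to-1 scale, and the step size is arbitrary.

```python
def next_stimulus_level(level, overwhelmed, step=0.05):
    """One update of a graded-exposure rule for a single trigger
    (e.g. how long the robot holds eye contact). The robot ramps the
    stimulus up slowly, but withdraws it entirely the moment the
    sensors indicate the child is overwhelmed."""
    if overwhelmed:
        return 0.0                      # back off completely
    return min(1.0, level + step)       # otherwise build tolerance gradually
```

Paired with the physiological monitoring described earlier, the `overwhelmed` flag would come from the same sensor-based state estimates, so the robot could respond before the child visibly withdraws.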