Maybe people open up to Ellie because talking to it is “like talking to a dog instead of a human,” University of Southern California researcher Louis-Philippe Morency told the reporters gathered in his small booth inside the Pentagon’s courtyard. Ellie is an animated person on a screen, an AI-powered therapist whose job is to listen, and who is very, very good at that job. Popular Science was there for the same reason Morency was talking about Ellie: last Wednesday was DARPA’s Demo Day, where the agency shows off a plethora of cool projects by diverse teams and clever scientists. There were 112 booths total, and inside booth 81 there was a video of a robot asking a man how much sleep he got.
Ellie is visible on a TV screen, but she’s hardly the only part of the SimSensei system. Above the screen is a camera, capturing the gaze and body language of the subject. This part of the system also includes a Kinect, for better readings of body language, and combined it’s called “MultiSense.” Last year, the program made news for its ability to diagnose depression, with a machine picking up on the monotone voice, long pauses, and gaze aversion that all indicate a depressed state. So what was DARPA doing showing off year-old tech? The answer lies both in the past and the future.
SimSensei grew out of USC Institute for Creative Technologies programs on enhancing cultural understanding. Body language, the gestures and ways of speaking done regularly in day-to-day life, is picked up passively and practiced over a lifetime, but the signals and manners vary from place to place. This technology was originally developed as a way for troops preparing to go into a new country to understand the norms of that country, so they didn’t cause offense or make a bad situation worse with the wrong gesture or by looking away at the wrong time.
It turns out that the technology might be more useful for troops coming home than going abroad. Currently, SimSensei is being evaluated with a Colorado National Guard unit. The group had sessions with SimSensei before deploying this past January, to get a baseline reading. Upon return in December, the group will again have sessions talking to Ellie, and if budget and time allow, there will be a third session, six months after the troops return from deployment. If all works out, Ellie could serve as a tool for counselors: she could listen to people talk for an hour, and doctors could use the analysis provided by SimSensei and MultiSense to better understand patients and help them with depression and PTSD.
According to a study done by ICT researchers Jonathan Gratch, Gale Lucas, Aisha King, and Morency, patients, too, preferred talking to the machine. The study compared two groups going in to see Ellie. One group was told a human was operating Ellie, like the Wizard of Oz behind a curtain, while the other group was told Ellie was just an AI, responding to them on its own. Afterwards, those who were told they were only talking to an AI reported being more relaxed than those told they were talking through an AI to a person. In addition, the people who were told they interacted with an AI reported less fear of self-disclosure than those who were told there was a human operating Ellie.
Beyond the wartime research, SimSensei has applications off the battlefield. The project’s aim is, according to Morency, “making subjective emotions objective,” and there are two immediate future applications. The first is a tool for helping young adults on the autism spectrum adapt to the social mores of the adult world, by giving them an automated partner to converse with. The second is also focused on helping teens into adulthood: SimSensei is currently working with the Cincinnati Children’s Hospital to help screen teens who show up at the hospital with suicidal ideations, differentiating through body language and speech patterns those more at risk than others. Morency was quick to emphasize that SimSensei is not capable of diagnosing on its own, but is instead a tool doctors can use.
Watch a demonstration of Ellie and SimSensei below: