Could a computer program catch what a human psychiatrist can’t? A new program called SimSensei, still in the early stages of development, logs people’s subtle body language and fleeting facial expressions to help diagnose depression, New Scientist reported.
The program even comes with an animated avatar that asks patients questions, “hmms” at appropriate times, and guides the conversation according to the patient’s answers, all while tracking the patient’s movements using Microsoft Kinect sensors and face recognition software.
Right now, a depression diagnosis depends on patients’ answers to standard questionnaires, which don’t take non-verbal cues into account, Stefan Scherer, SimSensei’s lead developer and a researcher at the University of Southern California’s Institute for Creative Technologies, told New Scientist. That can lead to missed diagnoses.
SimSensei is one of several programs under development that aim to log the differences in how people with depression make eye contact, smile, shift in their chairs, and give off other small clues to their condition. The Association for Computing Machinery is even hosting a contest between the programs to see which is best at picking out depressed patients from a database of videos of depressed and non-depressed people.
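At its simplest, the approach these programs share can be thought of as turning a session’s non-verbal cues into numbers and combining them into a score that flags a patient for clinical follow-up. The sketch below illustrates that idea only; the feature names, weights, and threshold are all invented for illustration and have nothing to do with SimSensei’s actual model.

```python
# Illustrative sketch of scoring non-verbal cues from a session.
# All features, weights, and thresholds here are hypothetical
# assumptions, not SimSensei's real method.

def nonverbal_risk_score(features):
    """Combine per-session non-verbal cues into one risk score.

    features: dict with values scaled to 0..1, e.g.
      eye_contact - fraction of the session with mutual gaze
      smile_rate  - smiling frequency, scaled
      fidgeting   - amount of postural shifting, scaled
    """
    # Hypothetical weights: less eye contact and less smiling,
    # plus more fidgeting, push the score upward.
    weights = {"eye_contact": -0.5, "smile_rate": -0.3, "fidgeting": 0.4}
    baseline = 0.5
    return baseline + sum(weights[k] * features.get(k, 0.0) for k in weights)

def flag_for_review(features, threshold=0.4):
    """Flag a session for clinician follow-up -- not a diagnosis."""
    return nonverbal_risk_score(features) >= threshold

# A session with little eye contact, few smiles, and much fidgeting:
session = {"eye_contact": 0.2, "smile_rate": 0.1, "fidgeting": 0.8}
print(flag_for_review(session))  # score 0.69 >= 0.4, prints True
```

In practice, systems entered in contests like the ACM challenge learn such weights from labeled videos rather than setting them by hand, and the output supports, rather than replaces, a clinician’s judgment.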
Watch SimSensei at work here: