Computer Vision Is Better At Seeing Your Secret Emotions Than Humans
Photo by Steve Depolo via Flickr, licensed under CC 2.0

A lot of people aren’t able to perceive micro expressions. They’re the tiny facial tics that reveal what we’re feeling when we’re trying not to let it show. So, understanding those micro expressions could let you look deeper into people’s faces and know what’s really going on inside.

But even people who can perceive micro expressions aren’t always accurate. In 2012, researchers in Finland described what they claimed to be the first system that used a computer to detect micro expressions. In the paper, they write that computers are particularly attractive in this field, since humans are only correct about 47 percent of the time.

And computers have only gotten better at their craft. In a paper submitted to arXiv, Xiaobai Li and a team of researchers (also in Finland) share their new machine vision algorithm. And they say it’s better at reading human faces than humans themselves. To test it, they first needed a database of what all of these micro expressions look like. To create the database, they asked 20 study participants to watch videos designed to elicit an emotional response. But, they were told, if they did show a response, they would have to complete a long questionnaire explaining those emotions (a brilliant deterrent). Turns out, it worked, and the researchers were able to gather 164 micro expressions using a high-speed camera.

Then, the algorithm needed to learn how to recognize and interpret those expressions. The algorithm gets its superpowers by “magnifying” the micro expressions: it isolates the parts of the face that are moving and distorts that area so it moves even further. The algorithm then learns which emotion is attached to those particular movements.
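Conceptually, the trick looks something like the rough Python sketch below. This is an illustration of the general motion-amplification idea, not the authors’ actual pipeline: the clip path, the amplification factor, and the classifier hookup are all placeholder assumptions.

```python
# A minimal sketch of the "magnify the tiny motions" idea (assumed, simplified):
# exaggerate each frame's deviation from the sequence's static content so a
# standard expression classifier has something bigger to look at.
import cv2
import numpy as np

ALPHA = 10.0  # how strongly to exaggerate the tiny motions (assumed value)

def magnify_micro_motion(frames, alpha=ALPHA):
    """Crudely exaggerate motion by amplifying each frame's deviation
    from the average (static) face in the clip."""
    stack = np.stack([f.astype(np.float32) for f in frames])
    baseline = stack.mean(axis=0)            # the static parts of the face
    motion = stack - baseline                # the tiny changes we care about
    magnified = baseline + alpha * motion    # push those changes further
    return [np.clip(f, 0, 255).astype(np.uint8) for f in magnified]

# Hypothetical usage: read a short clip, magnify it, then hand the frames
# to whatever expression classifier you already have.
cap = cv2.VideoCapture("micro_expression_clip.mp4")  # placeholder path
frames = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    frames.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
cap.release()

if frames:
    exaggerated = magnify_micro_motion(frames)
    # The exaggerated frames would then be featurized and fed to a trained
    # emotion classifier; that step is omitted here.
```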

But the real test is how it stacks up. Li and the team pitted the algorithm against lowly humans, having both watch videos of just micro expressions. Participants were instructed to identify which emotion went with which expression. Team human was about 72 percent accurate, and the computer’s best accuracy was about 82 percent. The second task involved picking out micro expressions from a longer video. The researchers made it easier on humans (allowing them to replay the videos, and only having them note how many micro expressions they saw, not when they saw them), since they felt the task was “too difficult.” Even with that advantage, humans scored an accuracy of about 50 percent, while the computer earned a comparable 42 percent.

All of this means we’re going to have to get a lot better at hiding our feelings from our computers.

[Via MIT Technology Review]