Now, computer scientists at the University of California, San Diego are applying this same strategy to robotics research. Through the use of machine learning, they've made it possible for their robot, an Einstein lookalike, to teach itself to make realistic facial expressions.
The team has been upgrading their Einstein-bot for several years; previously, they had to manually program the robot's 31 servo-driven facial "muscles" for it to form the right expressions. Now the robot has learned to guide itself. To quote the team's release:
To begin the learning process, the UC San Diego researchers directed the Einstein robot head (Hanson Robotics' Einstein Head) to twist and turn its face in all directions, a process called "body babbling." During this period the robot could see itself in a mirror and analyze its own expression using facial expression detection software created at UC San Diego called CERT (Computer Expression Recognition Toolbox). This provided the data necessary for machine learning algorithms to learn a mapping between facial expressions and the movements of the muscle motors.
Once the robot learned the relationship between facial expressions and the muscle movements required to make them, the robot could master facial expressions it had never encountered before.
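The core idea in the release, learning a mapping from motor commands to observed expressions during random "babbling," then inverting that mapping to hit novel target expressions, can be illustrated with a toy sketch. This is not the team's actual method or CERT's output; the linear "face" model, feature dimensions, and noise level below are all made-up stand-ins for the real robot and vision pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "body babbling": issue random commands to 31 servo "muscles" and
# record the resulting expression features. A hypothetical linear plant
# plus noise stands in for the physical face and the CERT detector.
n_servos, n_features, n_samples = 31, 6, 500
true_map = rng.normal(size=(n_servos, n_features))
commands = rng.uniform(-1.0, 1.0, size=(n_samples, n_servos))
features = commands @ true_map + 0.01 * rng.normal(size=(n_samples, n_features))

# Learn the forward mapping: servo commands -> expression features.
W, *_ = np.linalg.lstsq(commands, features, rcond=None)

# Invert it (least-squares) to find servo commands for a target
# expression the robot never produced during babbling.
target = rng.normal(size=n_features)
cmd = np.linalg.lstsq(W.T, target, rcond=None)[0]

print(np.allclose(cmd @ W, target, atol=1e-3))  # commands reproduce the target
```

With a learned forward model like `W`, generalizing to unseen expressions reduces to solving for the commands whose predicted features match the new target, which is the sense in which the robot could "master facial expressions it had never encountered before."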
So far, Einstein has managed to learn to convey emotions like sadness, anger, joy, and surprise. It's only a matter of time before it can stick out its tongue.
Hey guys, commenting is back, by the way. Sorry for the hiccup there.