Now, computer scientists at the University of California, San Diego are applying this same strategy to robotics research. Using machine learning, they've made it possible for their robot, an Einstein lookalike, to teach itself to make realistic facial expressions.
The team has been upgrading their Einstein-bot for the past several years, and previously had to program each of the robot's 31 servo-driven facial "muscles" by hand to produce the right expressions. Now the robot has learned to guide itself. To quote the team's release:
To begin the learning process, the UC San Diego researchers directed the Einstein robot head (Hanson Robotics' Einstein Head) to twist and turn its face in all directions, a process called "body babbling." During this period the robot could see itself in a mirror and analyze its own expression using facial expression detection software created at UC San Diego called CERT (Computer Expression Recognition Toolbox). This provided the data necessary for machine learning algorithms to learn a mapping between facial expressions and the movements of the muscle motors.
Once the robot had learned the relationship between facial expressions and the muscle movements required to make them, it could master expressions it had never encountered before.
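The release doesn't describe the learning algorithm in detail, but the basic idea, babble random motor commands, score the resulting face, fit a mapping, then invert it to reach a target expression, can be sketched in a few lines. Everything below is an illustrative assumption: the real CERT scoring is replaced by a hidden linear model, and the servo counts and expression labels are placeholders, not UC San Diego's actual setup.

```python
# Sketch of the "body babbling" idea: issue random servo commands,
# observe the resulting expression scores (a stand-in for CERT output),
# fit a linear map, then invert it to hit a target expression.
# The linear model and all names here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N_SERVOS, N_EXPRESSIONS = 31, 4  # 31 facial "muscles"; e.g. sad/angry/joyful/surprised

# Hidden ground truth standing in for the robot's real face plus CERT scoring.
true_map = rng.normal(size=(N_SERVOS, N_EXPRESSIONS))

def observe_expression(servo_cmd):
    """Stand-in for watching the mirror and scoring the face with CERT."""
    return servo_cmd @ true_map

# 1. Body babbling: random servo commands and the expressions they produce.
babble_cmds = rng.uniform(-1, 1, size=(500, N_SERVOS))
babble_obs = babble_cmds @ true_map  # what the "mirror" reports for each command

# 2. Learn the servo -> expression mapping by least squares.
learned_map, *_ = np.linalg.lstsq(babble_cmds, babble_obs, rcond=None)

# 3. Invert the learned map to produce an expression never commanded directly.
target = np.array([0.0, 0.0, 1.0, 0.0])            # "joy"
servo_cmd = np.linalg.pinv(learned_map.T) @ target  # minimum-norm servo command
achieved = observe_expression(servo_cmd)

print(np.round(achieved, 3))  # should land close to the target expression
```

In this toy version the mapping is linear and noiseless, so the inversion is nearly exact; a physical face is nonlinear and noisy, which is why the real system needs far richer babbling data and learning machinery than a single least-squares fit.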
So far, Einstein has managed to learn to convey emotions like sadness, anger, joy, and surprise. It's only a matter of time before it can stick out its tongue.