Imagine Cup 2012: Gloves That Translate Sign Language Into Spoken Language
The project that ended up winning the software design category of the Microsoft Imagine Cup worldwide finals in Sydney, Australia, and, consequently, the honor of taking home the shiny silver Cup itself, started innocently enough. A group of students from Ukraine noticed that several athletes at their school were hearing impaired, and they wanted to help them communicate better. That desire turned into a pair of gloves, absolutely loaded with sensors, that can understand sign language gestures and translate them into text and audible speech with 90 percent accuracy. And that’s just the prototype.
In the Enable Talk system developed by Team Quadsquad, each individual glove has 11 flex sensors—two in each finger and one across the palm—as well as two accelerometers, a gyroscope, a compass and a Bluetooth module that transfers all the data collected by the gloves as the user signs to a mobile device. “We gave [the athletes at our school] our prototype,” says team member Maxim Osika. “In the beginning, we thought it would be enough just to have the fingers, but they said they wanted broader gestures.” So the team added all the additional bells and whistles that would allow the gloves to recognize not only the alphabet, as formed by bending the fingers, but words and sentences made up of more sweeping hand motions as well.
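To get a feel for what one glove sends over Bluetooth, here is a minimal sketch of unpacking a single sensor frame on the receiving mobile device. The byte layout, field names, and the assumption that each channel is a 16-bit integer are all hypothetical; the article does not describe Enable Talk's actual wire format.

```python
import struct

# Hypothetical frame layout for one glove, for illustration only:
# 11 flex values, two 3-axis accelerometers, a 3-axis gyroscope and a
# 3-axis compass, each channel a little-endian 16-bit integer.
FRAME_FMT = "<23h"
FRAME_SIZE = struct.calcsize(FRAME_FMT)  # 23 channels x 2 bytes = 46 bytes

def parse_frame(payload: bytes) -> dict:
    """Unpack one Bluetooth payload into named sensor channels."""
    values = struct.unpack(FRAME_FMT, payload)
    return {
        "flex": values[0:11],    # two per finger plus one across the palm
        "accel": values[11:17],  # two 3-axis accelerometers
        "gyro": values[17:20],
        "compass": values[20:23],
    }
```

A stream of such frames, sampled as the user signs, is what any downstream recognizer would consume.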
At Team Quadsquad’s final presentation to the Imagine Cup judges, a member wearing the gloves signs “nice to meet you,” and a mobile phone using the Microsoft Speech API quickly repeats it out loud. There is an audible “ooh” from the audience. The Enable Talk system is user-taught, which makes it customizable for different countries’ sign languages and for individual idiosyncrasies in performing the signs. For the Imagine Cup, Team Quadsquad translated American Sign Language into English using the system’s neural network, but a user could teach it any gestures they wanted and map them to any language supported by the Microsoft Speech API. So, theoretically, you could even make up your own sign language. Obviously, it would take far too long to teach the system every word you might ever want to say, so eventually the team would like to offer a downloadable package of standard signs in different sign languages for the user to start out with. The gloves’ battery lasts eight hours, but its life can be extended by charging with the solar panels on the back.
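The “user-taught” idea can be illustrated with a toy stand-in for the team's neural network: the user records a few sensor snapshots for each sign, and a new reading is matched to the closest stored sign. The class below uses simple nearest-centroid matching rather than a neural network, and every name in it is invented for this sketch; it only shows the teach-then-recognize workflow, not Enable Talk's actual algorithm.

```python
import math
from collections import defaultdict

class UserTaughtRecognizer:
    """Toy gesture recognizer: the user teaches a few example readings
    per sign, and new readings are classified by finding the sign whose
    average (centroid) reading is nearest in Euclidean distance."""

    def __init__(self):
        self._examples = defaultdict(list)  # phrase -> list of readings

    def teach(self, phrase: str, reading: list) -> None:
        """Record one example sensor reading for a phrase."""
        self._examples[phrase].append(reading)

    def recognize(self, reading: list) -> str:
        """Return the taught phrase whose centroid is closest."""
        def centroid(rows):
            return [sum(col) / len(rows) for col in zip(*rows)]
        return min(self._examples,
                   key=lambda p: math.dist(reading,
                                           centroid(self._examples[p])))
```

In the real system, the recognized phrase would then be handed to the Microsoft Speech API for text-to-speech; here the output is just the text label.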
The level of detail the sensors pick up is truly impressive. During the presentation, a screen full of graphs (pictured above) shows the judges and the audience what each individual sensor is reading as the team member wearing the gloves waves and turns his hands and bends his fingers. If he bends just his pinky, that is the only sensor that shows a reading; the others remain static. Throughout the presentation, the system never once falters, saying things like “we love the Imagine Cup” and “we want to see kangaroos.” Still, Team Quadsquad isn’t satisfied with just 90 percent accuracy. “We’re shooting for 99.9 [percent] of course,” says Osika. And winning the Imagine Cup may just give them the boost they need to get there.