Sign language can greatly improve the communication capability of hearing-impaired people, but a major barrier remains: most people don't understand it. New prototype gadgets could change that by automatically translating hand motions into audible speech that a non-signing person can understand.
Students at the University of Houston designed a device called MyVoice, which uses a video camera to capture a person's sign language movements. It also contains a small video monitor, a microphone and a speaker. Software processes the images, determines which sign was made, and translates the word or phrase into speech delivered through an electronic voice.
It also works in reverse, capturing a person's spoken words and displaying the corresponding hand sign on the monitor. The students trained their software to recognize hand signs by sampling a database of images, according to a UH news release; the team used between 200 and 300 images per sign.
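The recognition step described here, training software on a few hundred labeled images per sign and then matching new input against what it learned, can be sketched in toy form. Everything below is an illustrative assumption: the vocabulary, the feature dimensions, the synthetic data, and the nearest-centroid classifier are stand-ins, since the article does not describe MyVoice's actual algorithm.

```python
# Illustrative sketch only -- NOT the MyVoice team's actual method.
# Feature vectors stand in for processed video frames; a nearest-centroid
# classifier stands in for whatever model the real software uses.
import numpy as np

rng = np.random.default_rng(0)

SIGNS = ["good", "job", "cougars"]   # hypothetical vocabulary
SAMPLES_PER_SIGN = 250               # ~200-300 images per sign, per the article
FEATURE_DIM = 32                     # stand-in for extracted image features

def make_dataset():
    """Synthetic stand-in for a labeled database of sign images."""
    X, y = [], []
    for label in range(len(SIGNS)):
        # Each sign forms its own cluster in feature space.
        center = rng.normal(scale=5.0, size=FEATURE_DIM)
        X.append(center + rng.normal(scale=1.0, size=(SAMPLES_PER_SIGN, FEATURE_DIM)))
        y.extend([label] * SAMPLES_PER_SIGN)
    return np.vstack(X), np.array(y)

def train(X, y):
    """'Training' here is just one mean feature vector per sign."""
    return np.stack([X[y == k].mean(axis=0) for k in range(len(SIGNS))])

def recognize(centroids, frame_features):
    """Return the sign whose learned centroid is nearest to the new frame."""
    dists = np.linalg.norm(centroids - frame_features, axis=1)
    return SIGNS[int(np.argmin(dists))]

X, y = make_dataset()
centroids = train(X, y)
# Simulate a new frame of the sign "job"; the recognized word would then
# be handed to a text-to-speech engine for the electronic voice.
test_frame = X[y == 1][0]
print(recognize(centroids, test_frame))
```

The point of the sketch is the data requirement, not the model: with a few hundred examples per sign, even a very simple classifier can separate a small vocabulary, which is consistent with the prototype handling only one phrase so far.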
It's not clear how well the translation algorithms work: so far, the device has translated only a single phrase, "Good job, Cougars," congratulating the students who designed it. The team has since graduated, but its members hope to continue developing the prototype and eventually build a functional, marketable device, according to industrial design student Sergio Aleman.
The MyVoice prototype recently won a first-place award at an American Society of Engineering Education conference.