Most people use the video camera on their phone for bootlegging concert footage or recording drunken antics. But for the deaf, for whom a cellphone's audio capability is moot, cellphone video offers a chance to expand beyond texting and into the more expressive communication of American Sign Language. Unfortunately, low-bandwidth American cellphone networks can't carry video clear enough for sign language.
That's where MobileASL comes in. This multi-university project developed a special algorithm that selectively compresses the video, lowering the resolution on everything but the speaker's hands. This shrinks the video to the point where it can pass over regular cellphone signals. And now, the MobileASL project has developed the first prototype phone that incorporates this technology.
The algorithm uses skin-tone sensing and motion tracking to separate the hands from the rest of the image. Once the hands are identified, the phone's computer dials the fidelity of the rest of the image way down, while increasing the sharpness of the speaker's hands. The resulting video has hands clear enough for ASL users to understand, but a total size small enough for American cellphone carriers to handle.
European and Asian users have used their phones for sign language for years, since their high-speed cellphone networks can handle uncompressed video with clear hands, no fancy new program required. Now Americans can join the party.
For more on how MobileASL works, and for a shot of the prototype device, check out this video:
because we all know it is important for deaf people to use the phone, or perhaps they could text like they do now...
It's a good idea to make phones more accessible even to deaf people. Maybe soon they'll make the technology more mainstream so that all cell phone videos can be longer but still good resolution in the main areas.
this is very stupid technology. first off, if you are signing you have to put the camera far enough away from you for the phone to see you clearly, and it can't be shaking much if at all. and the person you are signing to is most likely deaf, so he has to see the signs, which means the phone that's 4 feet away had better have a larger screen.
i can solve this problem fast. it's called text messaging.
Man, can't you just think about it? It's a trial-error process. They'll figure it out.
Agreed. This may be an alternative to texting, since some deaf people I know have a bit of an issue when it comes to writing, let alone texting. (Don't ask me why, I'm pretty sure writing issues are not connected to being deaf)
So is this supposed to be the safer alternative to texting while driving?
This is an interesting idea, but as a student who is learning ASL, it isn't the smartest idea to be focusing on the hands of a signer. You are supposed to focus on the face and watch their expressions, then look at their hands from your peripheral vision. A vital part of what they are saying comes from their facial expression, so perhaps they should think this out a tad better.
Although, texting is a lot easier.