Device Trains Blind People To ‘See’ By Listening

Seeing without sight

Researchers have found that newly blind people can learn to “see” with their ears. The key: technology called sensory substitution devices (SSDs) that convert visual stimuli into aural representations of the surroundings, allowing users to reacquire lost abilities. A blind man who drops his keys, for example, might be able to locate them by sound.

The devices could eventually duplicate “the phenomenological experience of vision” in blind and partially blind patients, the researchers say, and could obviate the need for costly eye surgeries and other treatments.

Previous studies have shown that SSDs help congenitally blind people navigate their environments. This most recent study, led by Michael Proulx of the University of Bath and published in Frontiers In Cognitive Science, instead tested how quickly and effectively blindfolded, sighted people could use one such device, the vOICe. The point? People who’ve been blind most of their lives typically compensate for their lack of vision with sharpened hearing and other senses. Not so with the newly blind. The researchers wanted to show that the device could work for those who’ve just lost their sight, too.

The vOICe consists of a backpack carrying a laptop, some sunglasses equipped with a camera, and earbuds. Here’s how it works, according to the study:

The vOICe converts images captured by a camera into “soundscapes” delivered to the user through headphones at a default rate of one soundscape per second. Each soundscape is a left to right scan of the visual scene with frequency representing the image’s vertical axis and loudness representing brightness… The user therefore experiences a series of “snapshots” passing from the left to the right ear.
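To make that mapping concrete, here’s a minimal sketch of this kind of image-to-sound conversion in Python. It is not the vOICe’s actual implementation; the frequency range, image format, and function name are my assumptions, following only the description above (a one-second left-to-right scan, vertical position as pitch, brightness as loudness):

```python
import numpy as np

SAMPLE_RATE = 44_100          # audio samples per second
SCAN_SECONDS = 1.0            # the study's default: one soundscape per second
F_LO, F_HI = 500.0, 5_000.0   # hypothetical pitch range in Hz

def soundscape(image: np.ndarray) -> np.ndarray:
    """Convert a 2D grayscale image (values 0.0-1.0, row 0 = top) to mono audio."""
    rows, cols = image.shape
    samples_per_col = int(SAMPLE_RATE * SCAN_SECONDS / cols)
    # Top image rows get high frequencies, bottom rows get low ones.
    freqs = np.linspace(F_HI, F_LO, rows)
    t = np.arange(samples_per_col) / SAMPLE_RATE
    # One sine wave per image row, shape (rows, samples_per_col).
    tones = np.sin(2 * np.pi * np.outer(freqs, t))
    # Scan left to right: each column's pixel brightnesses weight the row tones.
    chunks = [image[:, c] @ tones for c in range(cols)]
    out = np.concatenate(chunks)
    peak = np.abs(out).max()
    return out / peak if peak > 0 else out  # normalize to [-1, 1]
```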

A simple example would be a black screen with four slanted white lines, a bit like this (but with inverted colors):
/ / / /
The “soundscape” for that image would be four distinct, loud bleeps, each with an escalating pitch. Actually navigating the big outside world would be challenging, but, with enough practice and skill, perhaps doable. One user described it as “like figuring out where you’re walking in the dark,” the study notes. For those who learn best through experience, a demonstration video simulates the effect well (warning: harsh, loud noises).
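As a hypothetical usage of the sketch above, here’s that four-slanted-lines image rendered as audio. As the scan crosses each stroke, the bright pixel climbs the image, so each of the four bleeps sweeps upward in pitch:

```python
# (Uses numpy as np and the soundscape() sketch defined above.)
img = np.zeros((64, 64))                  # black background
for start in (0, 16, 32, 48):             # four evenly spaced strokes
    for i in range(12):
        img[40 - i, start + 2 + i] = 1.0  # one "/" drawn bottom-left to top-right
audio = soundscape(img)                   # ~1 second of mono audio at 44.1 kHz
```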

Vision is measured in terms of acuity: 20/20 vision means you can see from 20 feet what a person with normal vision can see from 20 feet (typically measured with an eye chart); 20/40 means you can see from 20 feet what a person with normal vision can see from 40 feet. In the U.S., legal blindness is 20/200 or worse.
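That arithmetic is simple enough to capture in a few lines; the helper names here are my own, not a standard API:

```python
def decimal_acuity(test_ft: float, normal_ft: float) -> float:
    """Snellen fraction as a decimal: 20/20 -> 1.0, 20/40 -> 0.5, 20/200 -> 0.1."""
    return test_ft / normal_ft

def legally_blind_us(denominator: float) -> bool:
    """US legal blindness: 20/200 or worse, i.e. decimal acuity <= 0.1."""
    return decimal_acuity(20, denominator) <= 0.1

assert decimal_acuity(20, 40) == 0.5
assert legally_blind_us(200) and not legally_blind_us(100)
```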

In Proulx’s study, blindfolded subjects equipped with the vOICe (or just headphones playing the associated sound profile, depending on the experiment) were asked to report the orientation of E’s from a Snellen eye chart on a computer screen in front of them; the size of the E’s corresponds to the acuity rating. The vOICe allowed most subjects to report with 75 percent accuracy at acuities between 20/2464 and 20/4682, with a best recorded acuity of 20/408 (which means the letters had to be pretty big for subjects to accurately say which direction they faced). Keeping in mind both that this isn’t actual vision and that these are inexperienced users, the results are promising, the researchers say. Of course, there’s a big difference between using the device in a controlled lab setting and actually taking it out on a noisy, crowded city street. And I bet the constant, screechy noise would take some adjusting to. Nevertheless, I’m sure this guy is intrigued.
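For illustration, here’s one simple way a 75-percent threshold like that could be read off accuracy data, by interpolating across letter sizes. This is not the paper’s analysis, and the numbers below are invented for the example:

```python
import numpy as np

# Snellen denominators tested (20/x); a larger x means a larger letter E.
sizes = np.array([408.0, 1232.0, 2464.0, 4682.0])
# Fraction of correct orientation reports at each size (made-up numbers).
accuracy = np.array([0.30, 0.55, 0.75, 0.90])

# Accuracy rises with letter size, so invert the curve at the 75% mark.
threshold = np.interp(0.75, accuracy, sizes)
print(f"Estimated acuity threshold: 20/{threshold:.0f}")  # -> 20/2464
```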

[Via Frontiers In Cognitive Science]

 