Ultrasonic Helmet Lets Anyone ‘See’ Like A Bat

Harnessing the power of aural navigation


“He clicks with his tongue as a way of understanding where he is in space. This is basically what bats do.” That’s how the science podcast Invisibilia recently described Daniel Kish, a blind man who taught himself how to navigate by echolocation. But their description slightly misses the mark. While both humans and bats can paint visual landscapes from echoes, the pointy-eared flyers possess a stark advantage: ultrasonic sound.

Those higher frequencies, which offer a much crisper picture of the world, underlie the Sonic Eye, a helmet that replicates bat echolocation.

“We were wondering whether humans needed special neural wiring to echolocate, or whether a human brain could do it with the same audio info that’s available to a bat with ears designed for ultrasonic sounds,” says Stanford theoretical neuroscientist and co-creator Jascha Sohl-Dickstein.

Invented as a side project by Sohl-Dickstein and his former colleagues at the University of California, Berkeley, the device features a speaker at its crown, which emits ultrasonic chirps like a bat. When the echoes rebound off objects, the sound waves travel into two bat-shaped ears, called pinnae, that rest on either side of the helmet and help gauge the direction of each echo. Molded from clay, each pinna has an ultrasonic microphone embedded at its center. A computer program records the echoes and instantly slows them by a factor of 20.
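The slow-playback trick is simple to sketch. The snippet below is a minimal illustration, not the team’s actual software: it writes recorded samples back out at one-twentieth of the capture rate, which stretches time and drops pitch by the same factor (the 192 kHz capture rate and simulated echo are assumptions for the demo).

```python
import numpy as np
from scipy.io import wavfile

CAPTURE_RATE = 192_000  # Hz; assumed rate for an ultrasonic-capable mic
SLOWDOWN = 20           # the factor the Sonic Eye uses

# Simulate a 5 ms, 50 kHz echo arriving 10 ms after the chirp.
t = np.arange(int(0.005 * CAPTURE_RATE)) / CAPTURE_RATE
echo = 0.5 * np.sin(2 * np.pi * 50_000 * t)
recording = np.concatenate([np.zeros(int(0.010 * CAPTURE_RATE)), echo])

# Declaring a 20x smaller sample rate in the output file makes playback
# 20x slower: the 50 kHz echo comes out as an audible 2.5 kHz tone.
wavfile.write("slowed_echo.wav", CAPTURE_RATE // SLOWDOWN,
              recording.astype(np.float32))
```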

Dropping the pace and the pitch makes the imperceptible ultrasonic echoes audible to the human ear. Sonic Eye wearers can then use the echo delay to judge distance or mentally track their surroundings. In a study published last month, the team shows that blindfolded wearers of the Sonic Eye can judge whether a dinner plate was moved left/right or up/down by roughly 20 centimeters, just over the length of a dollar bill.
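The arithmetic behind judging distance by delay is straightforward: sound makes the round trip to an object and back at roughly 343 m/s, and the 20x slowdown stretches that gap into something the ear can resolve. A quick sketch (the helper function and its numbers are illustrative, not from the study):

```python
SPEED_OF_SOUND = 343.0   # m/s in air at room temperature
SLOWDOWN = 20            # the Sonic Eye's playback factor

def perceived_delay(distance_m: float) -> float:
    """Seconds between chirp and echo as heard after the slowdown."""
    round_trip = 2 * distance_m / SPEED_OF_SOUND
    return round_trip * SLOWDOWN

# An object 2 m away: ~11.7 ms true delay, heard as a ~233 ms gap,
# long enough to judge distance by ear.
print(f"{perceived_delay(2.0):.3f} s")
```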

Along with possibly assisting the blind, the new device makes a good case that the human mind is innately capable of comprehending high-definition soundscapes, as bats do. Other assistive devices have tried to harness ultrasonic echoes, but they typically reprocess the sounds, discarding large amounts of spatial information.

“That’s the novelty here. A person uses the Sonic Eye to make sound judgments about the environment, but it doesn’t do anything to the signal apart from downsampling it,” says Lore Thaler, a psychologist at Durham University in the United Kingdom, who wasn’t involved in creating the device.

Thaler specializes in human echolocation, and her research has shown that for expert echolocators, sound perception sits somewhere between vision and hearing. When blind echolocators like Kish lie in an fMRI scanner, click their tongues and hear the echoes, their vision centers light up with activity, much as when a sighted person sees something.

But here’s a cool twist. When both blind and sighted people try echolocation, another brain area, one connected with understanding visual motion, switches on.

“It seems the brain processes echolocation somewhat separately from information for other types of sounds,” says Thaler. “Echolocation is not just hearing like everything else, but a special form of spatial audition that the brain possibly keeps set apart from other aspects of hearing.”

For now, blind echolocators are far better at the skill than sighted individuals trained in the art, because many, like Kish, developed the talent as children, taking months or years to perfect it. The question is whether the same would apply to the Sonic Eye.

“In theory, you could get a finer resolution with an ultrasonic signal versus what an echolocator would make with their tongue,” says sensory neuroscientist and co-developer Santani Teng, who now works at MIT. Because its wavelength is shorter, an ultrasonic bat chirp reflects off finer details of an object than any echo made by a human voice, so more spatial information bounces back for the brain to parse. Bats can perceive differences as small as 6 mm, about the thickness of three nickels stacked on top of each other.
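The resolution argument comes down to wavelength: surface details much smaller than the wavelength return little echo energy. A back-of-the-envelope calculation makes the gap concrete (the 55 kHz chirp and 3 kHz click frequencies are illustrative assumptions, not measurements from the study):

```python
SPEED_OF_SOUND = 343.0  # m/s in air

def wavelength_mm(freq_hz: float) -> float:
    """Wavelength of sound in air, in millimeters."""
    return SPEED_OF_SOUND / freq_hz * 1000

print(wavelength_mm(55_000))  # ~6 mm, a bat-style ultrasonic chirp
print(wavelength_mm(3_000))   # ~114 mm, roughly a tongue click's range
```

The ~6 mm wavelength of a 55 kHz chirp lines up with the smallest differences bats can perceive.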

A better audio-spatial picture, built from ultrasound, might speed up the echolocation learning process for humans. Plus, most blind human echolocators still use a walking cane, says Thaler, because they have trouble judging elevation and detecting obstacles near the floor. She says it would be interesting to see whether future users had an easier time tracking things on the ground.

“You don’t want to block out sensory cues that people need to navigate,” says Teng. For instance, bats can modify their ultrasonic pulse based on the size of the prey they’re hunting. A blind user should be able to change the ultrasonic output as much as they want, says Teng.
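One way to picture that flexibility is a chirp generator with user-tunable parameters. The sketch below is hypothetical, not the Sonic Eye’s code; the sweep range, duration, and sample rate are illustrative assumptions:

```python
import numpy as np
from scipy.signal import chirp

def make_chirp(f_start=45_000, f_end=25_000, duration=0.003, rate=192_000):
    """Linear frequency sweep, like a downward bat chirp."""
    t = np.linspace(0, duration, int(duration * rate), endpoint=False)
    return chirp(t, f0=f_start, t1=duration, f1=f_end)

wide_sweep = make_chirp()  # short, broadband pulse
narrow_sweep = make_chirp(f_start=35_000, f_end=30_000, duration=0.005)
```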

But before moving into studies with the blind, the team wants to miniaturize the current prototype into a headband, says co-developer Benjamin Gaub, a Berkeley PhD student in neuroscience who is developing the Sonic Eye into a product suitable for the visually impaired. The Sonic Eye currently requires a laptop, carried in a backpack, to run the device’s software, but with a little tweaking the simple program could run on a microchip or a smartphone. The team also plans to consult with the blind community in the Bay Area to customize additional features.