Daniel Kish is the President of World Access for the Blind. Here’s his tale from the field as told to Nicole Wetsman.

I don’t remember learning to echolocate. When I was an infant, I had cancer and had to have my eyes removed. I started clicking my tongue simply by instinct. Now I teach my methods to other blind people—adults, kids—to help them move around unassisted and regain independence.

When I click my tongue, the sound waves bounce off objects and echo back. The longer the delay between the click I make and the echo's return, the farther away an object is. My brain operates differently from that of people who don't have this skill, and I've had it scanned by scientists who study this ability. Their work helps us refine our teaching methods so we can reach more people. It also lends the practice more credibility.
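The delay-to-distance relationship can be made concrete: sound travels at roughly 343 meters per second in room-temperature air, and the echo covers a round trip, so the distance is half the delay times that speed. A minimal sketch of that arithmetic (the numbers and function name here are illustrative, not from the story):

```python
# Approximate speed of sound in air at about 20 degrees Celsius.
SPEED_OF_SOUND_M_PER_S = 343.0

def distance_from_echo_delay(delay_seconds: float) -> float:
    """Estimate distance to an object from the click-to-echo delay.

    The sound travels out and back, so the one-way distance is
    half of (delay * speed of sound).
    """
    return (delay_seconds * SPEED_OF_SOUND_M_PER_S) / 2

# Example: an echo returning after 12 milliseconds implies an object
# roughly 2 meters away.
print(f"{distance_from_echo_delay(0.012):.2f} m")  # -> 2.06 m
```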

Much of what they've learned matches what I would have expected from my own practice. For example, researchers found that the brain's visual cortex, which processes incoming information from the eyes, plays a key role in echolocation. As a blind person learns this skill, that area (and the regions connected to it) changes. It begins treating sound the same way it would treat signals from the eyes. The noggin takes in data and then forms it into usable information like images or cues about depth.

What we refer to as the visual system is really more like the imaging system. For me, this redefines what it means to see and be blind.

This story originally appeared in the Noise Winter 2019 issue of Popular Science.