
At a cocktail party, during a baseball game or in any other crowded, noisy place, your brain has to do some high-level filtering so you can separate important sounds from the clanking background. Turning your head helps with this, and researchers are studying how these head movements cause a shift in auditory perception. A team in Japan did the natural thing and used a humanoid robot, one that affably mimics a headphone-wearing human, to figure it out.

Hirohito M. Kondo and colleagues from NTT Corporation wanted to test how voluntary head movements change the flow of acoustic information, so they used a telepresence robot to cleanly reproduce what a moving listener would hear. First, human volunteers sat in a room with a speaker, which blared a combination of two different tones. At first the mixture sounded like noise, but the volunteers were quickly able to separate the two tones. Then the researchers brought in the robot.

The volunteers heard two-tone sounds through the ears of the robot, which is called Telehead. Microphones inserted into Telehead's ear canals picked up the sounds, which were transmitted in real time to the human's headphones. While a volunteer listened, the researchers would tell the person to move his or her head, and Telehead would move in sync. Through a series of experiments, the team says it was able to figure out how this voluntary head motion affects sound perception: it turns out that perception resets just after the motion. The brain has to re-filter all the sounds to come up with a clearer auditory signal. The signal also benefits from the spatial cues your brain picks up during motion, which help you determine the direction sounds are coming from.

This self-directed movement is part of how our brains actively sense the environment, the researchers say.

The research appears in this week’s edition of the Proceedings of the National Academy of Sciences.