The volunteers heard two-tone sounds through the ears of a robot called Telehead. Microphones inserted in Telehead's ear canals picked up the sounds, which were transmitted in real time to headphones worn by the volunteers. While a volunteer was listening, the researchers would instruct him or her to move, and Telehead's head would move in sync. Through a series of experiments, the team says it was able to work out how this voluntary head motion affects sound perception — it turns out that perception resets just after the motion, as the brain re-filters the incoming sounds to build a clearer auditory signal. The signal also benefits from the spatial cues the brain picks up during motion: listeners can better determine the direction sounds are coming from.