We recently gave the Parrot AR.Drone 2.0 a pretty solid review here on PopSci for improvements made to the recreational quadcopter's smartphone- or tablet-based control interface, which we found to be very intuitive. But a team of researchers at Zhejiang University in Hangzhou, China, has gone a big step further.
Robots that can read and respond to brain waves may eventually help stroke patients regain movement, using new neural interfaces to retrain damaged motor pathways. Neuroscientists have made great strides in brain-machine interfaces that respond to a person's thoughts; a new generation will drive a non-invasive robotic orthotic that retrains the patient's own body.
Concentrating deeply, Cathy Hutchinson stared at the tumbler of coffee on the table in front of her wheelchair. A cup-shaped dome on her head powered her small neural implant, capturing signals from her motor cortex as she thought about holding the mug. Slowly, the robot arm began to move.
Paralysis patients could play music with their minds, using a new brain-control interface that senses brain impulses and translates them into musical notes.
Users must first teach themselves to associate brain signals with specific tasks, producing neuronal activity that the brain scanners can pick up. Only then can they make music.
Brilliant Ten winner Yoky Matsuoka chats about the potential applications for her groundbreaking robotics research
By Gregory Mone | Posted 04.25.2008 at 10:39 am
Yoky Matsuoka, the director of the Neurobotics Laboratory at the University of Washington and one of the honorees in our most recent class of the Brilliant Ten, took some time to chat with Talking Robots about her work in particular and the future of robotics in general. One of Matsuoka's many projects involves building an anatomically correct mechanical hand—see the video above of the finger in action—and she also has big ideas about brain-machine interfaces, tele-manipulation, and robots in the home.