Paralyzed Woman Can Eat A Chocolate Bar With Graceful Mind-Controlled Prosthetic Arm

University of Pittsburgh researchers have allowed a paralyzed woman to pick up and reorient objects--and even feed herself--by controlling a robotic arm with her thoughts.
University of Pittsburgh Medical Center

Researchers at the University of Pittsburgh have built the most sophisticated mind-controlled prosthetic arm yet. Using a mix of cutting-edge hardware and complex programming, the team has enabled a 52-year-old woman, paralyzed from the neck down by a degenerative neurological disorder, to move a robotic arm and hand with a degree of nuance and fluidity never before seen.

That’s a boon not just for the prosthetics community, of course, but for the whole discipline of brain-machine interfaces--a field enjoying both an influx of funding (in the wake of two wars that have seen many soldiers lose limbs to improvised explosive devices) and huge leaps in capability, thanks to better algorithms for translating brain signals into the appropriate mechanical movements.

The University of Pittsburgh arm relies on two microelectrode arrays implanted in the patient’s left motor cortex, placed using functional MRI scans that pinpointed exactly which regions lit up when the patient was asked to think about moving her arm and hand. A complex set of algorithms then translates the brain signals into the corresponding movements, allowing the patient not only to move the arm but to pick up and reorient objects--with a 91.6 percent success rate. She was even able to feed herself a chocolate bar. It took her just two weeks to gain full control of the hand (though she could move it after only two days), and her speed increased with practice, suggesting that both the algorithms and the human brain can improve the performance of these kinds of brain-machine interfaces over time.
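The article doesn't publish the Pittsburgh team's actual decoding algorithms, but the general idea behind this kind of brain-machine interface can be sketched in a few lines. The toy example below, in Python, shows a minimal "population vector"-style linear decoder: each simulated neuron fires most strongly for movements in a preferred direction, and the decoder averages those votes into a velocity command for the robotic arm. All of the numbers and names here are illustrative assumptions, not the lab's real system.

```python
import numpy as np

# Hypothetical illustration of linear neural decoding; the real
# Pittsburgh algorithms are far more sophisticated.
rng = np.random.default_rng(0)

n_neurons = 96   # channels on one hypothetical implanted array
n_dims = 3       # decoded hand velocity: x, y, z

# Each neuron's "preferred direction": its firing rate rises when the
# intended movement aligns with this vector (found during calibration).
preferred = rng.normal(size=(n_neurons, n_dims))
preferred /= np.linalg.norm(preferred, axis=1, keepdims=True)

def decode_velocity(firing_rates, baseline):
    """Map a vector of firing rates to a 3-D velocity command."""
    modulation = firing_rates - baseline           # change from rest
    return modulation @ preferred / n_neurons      # weighted vote

# Simulate neurons responding to an intended rightward (+x) movement
# with simple cosine tuning, then decode the intent back out.
intended = np.array([1.0, 0.0, 0.0])
baseline = np.full(n_neurons, 20.0)                # spikes/s at rest
rates = baseline + 10.0 * (preferred @ intended)
v = decode_velocity(rates, baseline)               # dominated by +x
```

In practice the mapping is recalibrated as the patient practices, which is one reason performance improves over the first weeks of use.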

The next steps will be to make the system wireless and perhaps to integrate some kind of sensory feedback into it, so the patient can feel qualities like temperature, texture, and pressure.