Video: In Breakthrough Study, Paralyzed Patients Move a Robotic Arm With Their Own Thoughts

The most complex brain-computer interaction yet
With the BrainGate interface, patient S3, who is paralyzed from the neck down, moved a robotic arm toward a coffee cup, grasped it and took a drink. It was the first time in 15 years that she had been able to drink her coffee on her own. braingate2.org


Concentrating deeply, Cathy Hutchinson stared at the tumbler of coffee on the table in front of her wheelchair. A cup-shaped dome on her head powered her small neural implant, capturing signals from her motor cortex as she thought about holding the mug. Slowly, the robot arm began to move.

The elbow swung forward, the wrist turned and the fingers clasped around the cup. A moment later, she took a long drink, the first time since her stroke 15 years ago that she had enjoyed a sip of coffee without a caregiver's help. The feat is part of an ongoing clinical trial of a neural interface system, and it is both the first demonstration and the first published study of people using their own brain signals to control a robotic arm. It's a major breakthrough for neuroscience and engineering, and it could someday help people with paralysis live more independently.

Brain-controlled technologies could restore communication, mobility and independence for patients like Hutchinson, who is identified in the study as patient S3, said Dr. Leigh Hochberg, an engineering professor at Brown University and a neurologist at Massachusetts General Hospital. “We are hoping to provide a technology that will translate the intention to move, as decoded directly from brain signals, back into commands to control assistive devices or prosthetic limbs,” he said.

Previous research by this team showed that paralyzed patients could control a computer cursor with their thoughts, and last fall neuroscientists at Duke Medical Center showed that monkeys could control a robotic arm with theirs. This new paper, appearing today in Nature, shows the approach can work in humans. Hutchinson has had the implant for five years, according to study co-author John Donoghue, who has led the development of the technology, known as BrainGate. That both the implant and her motor cortex itself kept working for so long is an encouraging sign, he said.

“Fifteen years after her brain became disconnected from her limbs after her brain stroke, she was still able to create all the neural signals,” he said.

The technology is still a long way from widespread use, but Donoghue and Hochberg said in a news conference they were encouraged by its success so far.

To translate the patients’ thoughts, the scientists first ran the patients through a series of training exercises to decode their neural signals. The two patients watched two separate robotic arms: one developed by the DLR Institute of Robotics and Mechatronics in Germany, and the other by DEKA Research and Development Corp., an arm also known as the DARPA arm. The scientists controlled the arms’ motion, and the patients were asked to imagine themselves making those same movements.

“That elicits a pattern of electricity in their brains, and then we tell the robot, ‘That pattern means move the robot,'” Donoghue said. “When people think about moving, their brain elicits patterns that look to us like what should happen when you actually move, but of course no movement occurs. The motor cortex appears to work in a normal way, even years after an event like a stroke or spinal cord injury.”
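Donoghue's description amounts to supervised calibration: record neural activity while the arm makes known movements that the patient is imagining, then fit a mapping from activity to movement that later runs the other way, turning imagined motion into robot commands. The sketch below illustrates that idea in Python. It is not the BrainGate code; the ridge-regression decoder, the 96-channel array size and all of the data are hypothetical stand-ins for the more sophisticated velocity decoder a real system would use.

```python
import numpy as np

# Minimal sketch of decoder calibration (hypothetical data, not the
# BrainGate code). During training, the robot arm moves on its own while
# the patient imagines making the same movement; we record neural firing
# rates alongside the arm's known velocity, then fit a linear map from
# rates to velocity. Ridge regression stands in for the more elaborate
# decoder a real system would use.
rng = np.random.default_rng(0)

n_samples, n_channels = 600, 96              # e.g., a 96-electrode array
true_map = rng.normal(size=(n_channels, 3))  # hidden rates-to-velocity map

# Synthetic training data: firing rates and the arm's known 3-D velocity.
firing_rates = rng.poisson(5.0, size=(n_samples, n_channels)).astype(float)
arm_velocity = firing_rates @ true_map + rng.normal(scale=0.5, size=(n_samples, 3))

# Calibration: solve the regularized least-squares problem
#   decoder = argmin ||X W - V||^2 + lam ||W||^2
lam = 1.0
gram = firing_rates.T @ firing_rates + lam * np.eye(n_channels)
decoder = np.linalg.solve(gram, firing_rates.T @ arm_velocity)

# Closed-loop use: each new window of firing rates becomes a velocity
# command for the robotic arm.
new_rates = rng.poisson(5.0, size=(1, n_channels)).astype(float)
velocity_command = new_rates @ decoder
print("decoded velocity command:", velocity_command.round(2))
```

The key design point, and the reason this works for paralyzed patients, is that calibration never requires real movement: the robot's known trajectory supplies the labels, and the patient's imagined movement supplies the neural data.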

Hutchinson, who is 58, suffered a brainstem stroke that robbed her of speech and of movement below the neck. She occasionally experiences involuntary movements of her arms, but she cannot control them. The other subject, a 66-year-old man known as T2, also suffered a brainstem stroke that left him unable to move or speak. Initially after their strokes, both patients had locked-in syndrome and were limited to small movements of their eyes. Patient T2 communicates by responding to individual letters as an alphabet is read aloud, Hochberg said; Hutchinson has recovered somewhat more and has limited movement of her neck.

As the patients’ neural signals were decoded, they were asked to use the arms to reach out and grasp foam targets placed in front of them. Then Hutchinson tried the coffee experiment, on April 12, 2011. Watch it in the video below: just before the two-minute mark, liquid flows through the straw. The look on her face, Hochberg said, says it all.

In 158 trials over four days, she touched the target within the allotted time in 48.8 percent of trials using the DLR robotic arm and in 69.2 percent using the DEKA arm, according to the paper. In 45 trials with the DEKA arm, T2 touched the target 95.6 percent of the time.

“I just imagined moving my own arm and the [DEKA] arm moved where I wanted it to go,” he said later.

Patrick van der Smagt, head of bionics and assistive robotics at DLR and the TU Munich, said the goal is a robotic arm that moves intuitively. Future iterations could give the arm more autonomy by decoding a patient’s higher-level intent, he said.

“From the signals you have, you can read more than the movements — you can read intention of the movements. If you are moving toward the cup, it’s clear that you want to go to the cup to grasp it,” he said.
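Van der Smagt's example, that a reach heading toward the cup already reveals the goal, can be illustrated with a small geometric check: score each candidate target by how directly the decoded motion points at it, and let the arm finish the reach toward the most likely one. Everything below (the positions, the targets, the scoring rule) is a hypothetical illustration, not the DLR software.

```python
import numpy as np

# Hypothetical sketch of inferring "higher-level intent" from decoded
# motion: rank candidate targets by the cosine similarity between the
# decoded velocity and the direction from the hand to each target.
def infer_target(hand_pos, velocity, targets):
    """Return (best_target, score) for the target the motion points at."""
    v = velocity / (np.linalg.norm(velocity) + 1e-9)
    scores = []
    for t in targets:
        d = t - hand_pos
        d = d / (np.linalg.norm(d) + 1e-9)
        scores.append(float(v @ d))      # 1.0 means heading straight at it
    best = int(np.argmax(scores))
    return targets[best], scores[best]

hand = np.array([0.0, 0.0, 0.0])
vel = np.array([0.9, 0.1, 0.0])          # decoded command: mostly along +x
cup = np.array([0.5, 0.0, 0.1])          # candidate goals on the table
phone = np.array([0.0, 0.5, 0.1])

goal, score = infer_target(hand, vel, [cup, phone])
print("inferred goal:", goal, "confidence:", round(score, 2))
```

A real shared-autonomy controller would accumulate evidence over time rather than judge a single velocity sample, but the one-shot check shows the core idea: the intention is readable from the trajectory before the hand ever reaches the target.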

The Department of Veterans Affairs and the National Institutes of Health funded the work.

The ultimate goal is a smaller, perhaps implantable system that would give a paralyzed patient or someone with limb loss full control over his or her environment, Hochberg said.

“The real dream is to one day reconnect brain to limb, to bring these powerful signals from the motor cortex to the peripheral nerves. Someone with paralysis would be able to reach out and pick up that coffee cup with their own limb, of their own volition,” he said.

 
