Double The Fun: Monkeys Control Two Virtual Arms With Just Their Thoughts

It's yet another step toward creating lifelike, mind-controlled prosthetics for those who have lost the use of their limbs.

Those old Doublemint chewing gum ads got it right: Two is better than one. That’s the idea behind a new study, published today, that showed off a system that hooked up monkeys’ brains directly to a computer. On the computer screen, the monkeys saw two avatar arms. Motivated by a few sips of juice if they got it right, the monkeys were able to move those arms to targets just by thinking about it. After training, they could produce the movements while keeping their real arms still. It was like playing a first-person computer game, but instead of tapping out commands on a keyboard, the monkeys controlled their avatar arms with their thoughts.

Previous experimental systems have allowed both people and monkeys to control one arm, whether on a computer screen or a physical, robotic arm. This is the first time researchers have been able to add in a second arm.

In the future, experimental systems like this could provide those who have lost the use of their arms or legs with helpful, lifelike prosthetic limbs. The science of mind-controlled prosthetics is still in its early stages, however, in spite of years of work. “We have high hopes, but we need to be patient,” says Daofen Chen, the program director for systems and cognitive neuroscience at the U.S. National Institute of Neurological Disorders and Stroke. Chen was not involved in developing the new two-armed system. “Brain research, it’s really a long haul.”

The new system, developed at Duke University in North Carolina, offers the beginning of coordinated, two-armed movement. In experiments, two rhesus monkeys learned to move their avatar hands to two white squares or circles on screen and hold position for anywhere from 40 milliseconds to one second to get a juice reward. The controls for the movements came from electrode arrays implanted in the monkeys’ brains. The arrays were connected to devices running software the researchers wrote, which amplified the monkeys’ brain signals and translated them into on-screen movements.
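To give a rough sense of what "translating brain signals into on-screen movements" involves, here is a minimal sketch of one common approach: fitting a linear decoder that maps neural firing rates to avatar-arm positions. This is an illustration only, not the Duke team's actual algorithm (their decoding methods are more sophisticated), and all the data here is simulated; the neuron count of 500 is the only figure taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 500   # roughly the number of neurons recorded per monkey
n_samples = 2000  # hypothetical number of training time bins

# A made-up "true" mapping from firing rates to four outputs:
# (left-arm x, left-arm y, right-arm x, right-arm y).
true_weights = rng.normal(size=(n_neurons, 4))

# Simulated spike counts per time bin, and the arm positions they "caused".
rates = rng.poisson(lam=5.0, size=(n_samples, n_neurons)).astype(float)
positions = rates @ true_weights + rng.normal(scale=0.1, size=(n_samples, 4))

# Fit the decoder: find weights W minimizing ||rates @ W - positions||^2.
W, *_ = np.linalg.lstsq(rates, positions, rcond=None)

# Decode a fresh time bin of firing rates into two avatar-arm positions.
new_rates = rng.poisson(lam=5.0, size=(1, n_neurons)).astype(float)
left_xy, right_xy = (new_rates @ W).reshape(2, 2)
```

In a real brain–machine interface this fit-then-decode loop runs continuously, with the decoder retrained or adapted as the animal practices, which is part of why the monkeys needed training before they could drive both arms without moving their own.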

There’s still a long way to go before such systems are able to mimic two-handed tasks people perform all the time, like, say, opening a jam jar. The monkeys didn’t have to control their avatar hands at all; they just had to move the arms. The level of coordination required to hold their arms in place for a little while was also pretty simple, even compared to the two-direction twisting you have to do to open a jar.

“Their study is a significant step forward. It could lead, eventually, to more complicated tasks,” Chen tells Popular Science. “But I think it needs to be pointed out that I don’t want to—I mean you don’t want the reader to believe bimanual tasks are that simple.”

The two-armed system begins to solve an entirely different problem from moving just one arm, the Duke researchers said in a statement. (The lead researcher, Miguel Nicolelis, wasn’t able to talk in time for the publication of this story.) Studies in non-human primates show the brain doesn’t simply double the signals it sends when it does two-handed things. Instead, it produces patterns of activity specific to two-limb movement. The Duke team recorded the output of almost 500 neurons in each of their monkeys to learn more about how the brain creates two-armed movement—and to check whether 500 neurons were enough to drive two-armed moves.

They published their work today in the journal Science Translational Medicine.