Device reads brain activity to help locked-in people communicate

It can tell whether a completely paralyzed person is thinking “yes” or “no”
A woman models a brain-computer interface that can help people who are completely paralyzed answer yes-or-no questions. Wyss Center www.wysscenter.ch


People with a rare condition called complete locked-in syndrome may finally have a link to the outside world. By reading the brain activity of completely paralyzed people in a new way, researchers could tell whether the test subjects were thinking “yes” or “no” in response to a question. The results were published Tuesday in the journal PLOS Biology.

Four people with complete locked-in syndrome used the new technology to report that they were happy. The findings indicate that these people actually can communicate, with help from the right technology.

The new work is a great proof of principle, says Steven Laureys, who leads the Coma Science Group at the University of Liège in Belgium, and who wasn’t involved in the new study. Communication tools like these can make a big difference in a paralyzed person’s quality of life. “We will see more and more patients in years to come who, through these technologies, will be given a voice,” he says.

Losing touch

People with locked-in syndrome lose control over their muscles but remain fully conscious. The paralysis can happen suddenly, due to a stroke or brainstem injury, or slowly, when motor neurons are destroyed in amyotrophic lateral sclerosis (ALS).

There are a few tools that can help profoundly paralyzed people communicate. Stephen Hawking, who has ALS, twitches his cheek to operate a speech synthesizer. Jean-Dominique Bauby, the editor of the French fashion magazine ELLE, blinked to select different letters of the alphabet while writing his memoir, The Diving Bell and the Butterfly, in the 1990s. Today, sophisticated eye trackers are available.

And then there are brain-computer interfaces, which read brain signals and use them to command computers or robots. People with locked-in syndrome have used them to control computer cursors and spell.

In rare cases, people can’t even move their eyes or eyelids. These people, who are completely locked in, are harder to reach. It’s not entirely clear why they have trouble using brain-computer interfaces, which aren’t controlled by muscle movement.

Niels Birbaumer, a neuroscientist at the Wyss Center for Bio and Neuroengineering in Geneva and coauthor of the new research, has an idea about why this may be, although there’s little evidence to support it so far. He proposes that it becomes harder over time for patients to channel their thoughts into voluntary action. “Anything you want, everything you wish does not occur. So what the brain learns is that intention has no sense anymore,” he says. “It is too difficult for them to switch from [a] more reflective state into an attentive state.”

Making contact

Birbaumer and his colleagues made a few tweaks to design a brain-computer interface that would work for people who are completely locked in.

The team stuck to simple questions that could be answered with a yes or no. “The answer in your head…occurs quickly and it occurs like a reflex. You don’t have to mobilize a lot of resources for such a simple answer,” Birbaumer says.

Most brain-computer interfaces have relied on electroencephalography (EEG), which measures the electrical activity of neurons. Birbaumer and his colleagues instead measured changes in blood flow using a technique called functional near-infrared spectroscopy (fNIRS), which is similar to functional magnetic resonance imaging (fMRI).

“We worked for more than 10 years with neuroelectric activity [EEG] without getting into contact with these completely paralyzed people,” says Birbaumer. “Out of desperation we decided to move to a different type of brain activity and we were lucky.”

EEG did come in handy for reading whether a participant had stopped paying attention or had nodded off. But monitoring the brain’s blood flow could essentially reveal what participants were thinking.

The team started with questions that people would already know the answers to, like their own names or the capital of France. “After about 100 questions of this kind then the computer knows roughly how a yes and a no looks like in the brain of these patients,” Birbaumer says.

When the brain-computer interface registered correct responses 70 percent of the time, the team moved on to open-ended questions. The most important of these questions was, “Are you happy?” All four patients repeatedly said yes.
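The workflow the researchers describe — calibrate a decoder on roughly 100 known-answer questions, then switch to open-ended ones only once accuracy clears about 70 percent — can be sketched in a few lines. The sketch below is purely illustrative, not the study’s actual analysis pipeline: the synthetic feature vectors, the logistic-regression classifier, and the variable names are all assumptions standing in for the real fNIRS processing.

```python
# Illustrative sketch (assumptions, not the study's pipeline): calibrate a
# yes/no classifier on known-answer questions, then only proceed to
# open-ended questions once accuracy clears the 70 percent threshold.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Stand-in for fNIRS features: one feature vector per known-answer question.
# In the real system these would be blood-flow measurements; here they are synthetic.
n_questions, n_channels = 100, 20
X_calibration = rng.normal(size=(n_questions, n_channels))
y_calibration = rng.integers(0, 2, size=n_questions)   # 1 = "yes", 0 = "no"
X_calibration[y_calibration == 1] += 0.5               # fake separability for the demo

# Learn roughly what "yes" and "no" look like for this person.
classifier = LogisticRegression(max_iter=1000)
accuracy = cross_val_score(classifier, X_calibration, y_calibration, cv=5).mean()

if accuracy >= 0.70:
    # Only now would open-ended questions (e.g. "Are you happy?") be asked.
    classifier.fit(X_calibration, y_calibration)
    new_trial = rng.normal(size=(1, n_channels))        # hypothetical new response
    print("Decoded answer:", "yes" if classifier.predict(new_trial)[0] else "no")
else:
    print(f"Accuracy {accuracy:.0%} below threshold; keep calibrating.")
```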

Their answers aren’t too surprising. These people had already chosen to live by opting for artificial respiration when they could no longer breathe on their own. And previous research on locked-in syndrome also found that people report a pretty good quality of life. “The usual things we think we live for—eating, drinking, having free time, enjoying social activities—most of them go back to zero,” Birbaumer says. “But what stays and what determines quality of life is close family and friend interactions.”

In the future, Birbaumer and his team want to build up to more complicated communication and test the device on more people. They plan to use electric stimulation to boost people’s attention. “We will also implant the electrodes in future in the brain so we need much less energy to decode the thinking of these people,” he says.

It’s promising that the team could decipher yes and no answers, says Stephen Helms Tillery, a neuroscientist at Arizona State University in Tempe who designs neuroprosthetics and who wasn’t involved with this study. “It’s certainly conceivable that this will … enable at least sort of rudimentary communications with people in this complete locked-in [state].”

 
