Mind-Reading Algorithm Knows What You’re Looking At

An algorithm that analyzes brain signals can accurately determine what people see in real time, according to new research.

“We’re decoding the human perceptual experience — your ability to see different things and understand what they are,” lead author Kai Miller told Popular Science.

The researchers used electrodes to measure brain signals in seven volunteer subjects. They showed the subjects images of human faces and of houses in 400-millisecond flashes, instructing them to report whenever they saw an upside-down house to make sure they stayed focused on the task.

Then, computer software measured two types of brain signals: event-related broadband changes, which occur when neurons fire out of sync, and event-related potentials, which result from neurons firing together.
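The distinction between the two signal types can be illustrated with a toy computation. This is a hedged sketch on synthetic data, not the study's actual signal processing: averaging voltage traces across trials isolates the phase-locked potential, while the leftover, asynchronous activity shows up as per-trial power.

```python
import numpy as np

# Illustrative sketch (not the study's pipeline): synthetic voltage
# traces, one row per stimulus presentation.
rng = np.random.default_rng(0)
trials = rng.normal(size=(200, 400))          # 200 trials x 400 samples
trials += np.sin(np.linspace(0, np.pi, 400))  # shared evoked response

# Event-related potential: phase-locked activity survives averaging,
# while out-of-sync activity cancels out.
erp = trials.mean(axis=0)

# Broadband power: subtract the ERP so only asynchronous activity
# remains, then take each trial's mean squared amplitude.
broadband = ((trials - erp) ** 2).mean(axis=1)

print(erp.shape, broadband.shape)
```

The key design point is that the same recording yields both features: one by averaging across trials, one by measuring what the average leaves behind.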

The algorithm was then “trained” on 200 responses recorded from subjects during the test period. After that, the researchers asked it to predict, on its own, the subjects’ next 100 responses. With only about a half-second of delay, it accurately determined both when subjects were looking at a picture and what type of picture it was.
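The train-then-predict protocol described above can be sketched in a few lines. This is a minimal illustration on synthetic features with a nearest-centroid classifier, not the paper's actual decoder: 200 labeled responses train the model, which then labels the next 100 unaided (face = 0, house = 1 are assumed labels for the sketch).

```python
import numpy as np

rng = np.random.default_rng(1)

def fake_response(label, n=16):
    # Hypothetical feature vector: the two classes have different mean
    # patterns, plus noise. Stands in for measured brain-signal features.
    center = np.full(n, 1.0) if label else np.full(n, -1.0)
    return center + rng.normal(scale=0.8, size=n)

labels = rng.integers(0, 2, size=300)
X = np.array([fake_response(y) for y in labels])

# Train on the first 200 responses: one centroid per class.
X_tr, y_tr = X[:200], labels[:200]
centroids = {c: X_tr[y_tr == c].mean(axis=0) for c in (0, 1)}

# Predict the next 100 responses by nearest centroid.
X_te, y_te = X[200:], labels[200:]
preds = np.array(
    [min(centroids, key=lambda c: np.linalg.norm(x - centroids[c]))
     for x in X_te]
)
accuracy = (preds == y_te).mean()
print(f"accuracy on held-out 100: {accuracy:.2f}")
```

The 200/100 split mirrors the setup reported in the article; everything else (feature construction, classifier choice) is an assumption made for illustration.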

Miller says that there have been past successful attempts to decode elements of what people see using brain activity, but that those algorithms needed to be told that the subjects were looking at something. This algorithm, however, is capable of detecting that on its own.

When asked whether he would call it mind reading, Miller notes, “It is a type of mind reading, but it’s not as nefarious as that would sound.”

Especially, he says, because these findings could contribute to the longer-term goal of rehabilitating the brain after a stroke or the removal of a tumor.

“We can actually start to understand the dynamics of brain areas,” he explains. “It could be used to understand diseases that spread throughout the brain.”