As you listened to your colleagues' conversations at work today, or to a podcast on the train home, or to your personal trainer shouting "lift," your brain completed some complex tasks. The frequencies of syllables and whole words were decoded and given meaning, and you made sense of the language-filled world we live in without actively thinking about it. Now a team of researchers from the University of California, Berkeley has figured out how to map some of these cortical computations. It's a major step toward understanding how we hear — and a possible step toward hearing what we think.
By decoding patterns of activity in the brain, doctors may one day be able to play back the imagined conversations in our heads, or to communicate with a person who can think and hear but cannot speak.
Brian Pasley and colleagues at UC Berkeley worked with 15 volunteer patients who were being treated for epilepsy. The team also included researchers from UC San Francisco, the University of Maryland and The Johns Hopkins University. To pinpoint where the seizures originated, surgeons had implanted electrodes directly onto the patients' brains, providing a rare opportunity to study electrical signals in various brain regions. Pasley said the research team visited patients in their hospital rooms and played them recorded words while monitoring activity in the superior temporal gyrus, a region of the auditory cortex.
"We're looking at which brain sites become active. Because we can determine some association between those brain sites and different frequencies, we can watch what brain sites are turning on and off for these recordings, and that lets us map back to the sound," he said.
Because neurologists know the frequencies of certain phonemes — specific language sounds — this cortical spectroscopy can decode which sounds, and then perhaps which words, a person is hearing. Pasley compared it to piano playing: "If you're an expert pianist, you know what musical notes are associated with each piano key, and you understand that relationship between the key and the sound," he said. "If you turn the sound off, and have the pianist watch which piano keys are being pressed, this expert would have an idea what sound is being played even though they can't hear anything."
The patients heard single words or sentences within the range of normal speech, between 1 and 8,000 Hz, Pasley said. The words were spoken by both men and women with a wide range of voice frequencies. As the patients listened, their brain activity was recorded. Pasley then developed two computational models that crunched the electrode recordings and predicted the word being heard. One of the two methods could produce a reconstruction so close to the original word that Pasley and his colleagues could guess what it was 90 percent of the time, he said.
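To get a feel for the kind of decoding described above, here is a toy sketch in Python. It is purely illustrative (the paper's actual models, electrode counts, features and fitting procedure are not reproduced here): it simulates each electrode's activity as a noisy linear mixture of the sound's frequency-band energies, then fits a linear filter that maps the recordings back to a spectrogram, much as a pianist maps keys back to notes.

```python
# Illustrative sketch only: a linear spectrogram-reconstruction decoder
# fit on simulated "electrode" data. All sizes and values are assumptions.
import numpy as np

rng = np.random.default_rng(0)

n_bands = 16        # spectral bands spanning roughly 1-8,000 Hz
n_electrodes = 32   # simulated recording sites
n_samples = 500     # time points of training data

# Hidden "true" tuning: how strongly each electrode responds to each band.
tuning = rng.normal(size=(n_electrodes, n_bands))

# Training data: spectrogram of the heard speech, and the noisy
# electrode activity it evokes.
spectrogram = rng.random((n_samples, n_bands))
activity = spectrogram @ tuning.T + 0.1 * rng.normal(size=(n_samples, n_electrodes))

# Fit the decoder by least squares: electrode activity -> spectrogram.
decoder, *_ = np.linalg.lstsq(activity, spectrogram, rcond=None)

# Reconstruct a held-out "word" from its evoked activity alone.
test_spec = rng.random((1, n_bands))
test_act = test_spec @ tuning.T
reconstructed = test_act @ decoder

print(np.max(np.abs(reconstructed - test_spec)))  # small reconstruction error
```

The point of the sketch is the direction of inference: the decoder never sees the test sound itself, only the brain-like activity it evokes, yet it can recover an approximation of the original spectrogram.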
"It's not intelligible, but you can identify some similarities," he said. Watch the video below to hear what he means.
Neuroscientists have long been trying to decode the inner workings of the brain, associating neurons in the sensory cortices with the stimuli that fire up those neurons. But this paper, along with other recent research, peers more deeply into the recesses of our minds, promising to illuminate thoughts so they can be seen and shared with others.
In December, Boston University researchers published research explaining how they stimulated patients' visual cortices and induced brain patterns to create a learned behavior, even when the subjects did not know what they were supposed to be learning. Last fall, Jack Gallant — also at UCB — published a paper describing the reconstruction of video images by tapping the visual cortices of people who watched the videos.
This form of mind-reading, which neurologists prefer to call "decoding," is a long way from everyday use. And there are clearly ethical questions surrounding it (although it would be hard to implant electrodes to peep in on an unwilling person). But there are practical, medically motivated reasons to pursue it, like communicating with locked-in patients, or with those who have lost the ability to speak because of a stroke or a degenerative muscle disease. Whether that is possible depends on some other vagaries of the brain that are still not well understood, Pasley said: the development of neural prostheses rests on the assumption that brain activity is the same during real experiences and imagined ones.
"There is some evidence that when people imagine visual stimuli or sound stimuli, some of the same brain areas do seem to activate as when you are actually looking at something or hearing something," he said. "But we didn't have a good idea at all, even if the same areas are activating, if they are processing the same way, and using the same rules, as during perception."
In this study, the researchers only focused on English words and phonemes, but Pasley would like to study other languages too, he said. The paper appears in the journal PLoS Biology.
This is great! People's thoughts could be decoded! Locked-in patients would finally have a chance to communicate with the rest of the world! And dreams could also be recorded!
what i want is to see this used to mimic a human being for the better part of a month.
i'd build a box that would be glued to the back of your head. over the course of a month it would read everything you thought and transmit it back to a computer. there i would clean up a few things like specific decisions and moral hijinks so that i could clone a personality and bam, instant artificial intelligence.
set up a robot, up-link it to a base computer, and have the emulation software on that base. instant lab minion! take the mad scientist approach to any problem: build your solution!
to mars or bust!
After they figure out how to decode, then they will try to figure out how to write. That can be good for prosthetics, video games, science, and mind control.
I wonder if people who speak different languages have the same brain waves for same things.
Say goodbye to the polygraph and hello to "you are f***ed if you lie"!!!! Bwahahahahaha. Hook that up to our presidential candidates!!!!!
Once it is perfected to read the thoughts of human minds, it will just be a matter of time before we connect that individual to a computer and process inputs into his mind. Computers then will be programming people!
Why have you published a science article without a source?
SOURCE (Open access on PloS)
Pasley BN, David SV, Mesgarani N, Flinker A, Shamma SA, Crone NE, Knight RT, & Chang EF (2012). Reconstructing speech from human auditory cortex. PLoS Biology, 10 (1), e1001251
More videos and podcast from the researchers here: http://neurobonkers.com/2012/02/01/video-uc-berkeley-scientists-reconstruct-speech-from-brain-waves/
dove @ 7,000hz clear coding at frequencies bio-compatible to sustained configuration.
So... Which government is likely to be the first one to abuse this technology and violate our right to privacy? Ours, of course. But you probably figured that out when you read the title.
old news. I have been eavesdropping on my neighbors' brains for the last three years. Just get a device that captures ultra-low RFs and point it at your victim's head. Convert the RFs to sound. Easy. Thanks to Ken Trussell from Launchpad for helping me spy on my neighbor.