Brain-machine interfaces hold potential for a variety of ends, from helping the neurologically or physically disabled communicate and interact with their environments, to creating thought-controlled computers that augment the brain with computing power. A group of researchers at Columbia are turning that model on its ear, using brain power to augment computing tasks. Their device couples the human brain and computers to perform tasks neither could do as efficiently on their own.
The device, known as C3Vision (cortically coupled computer vision), taps the brain's fast processing power to help computer programs manage complex problems, particularly those posed by image recognition. An electroencephalogram (EEG) cap on the user's head detects neurological signals while the computer flashes images on a screen at a rate of about ten per second. The conscious brain doesn't have time to adequately consider each image, but the subconscious is hard at work.
The system excels at problems that computers struggle to tackle. For instance, it's easy enough to search the Web for a picture of a bicycle, but it's far more difficult for a search engine like Google or Bing to find something that looks "odd" or perhaps "silly." The brain, however, can take these less-defined, more abstract qualifiers and very quickly assess whether or not an image fits the term.
The conscious brain doesn't even have to get involved. The images flash too quickly for a person to consciously rate his or her interest in each one, but the brain's visual pathways respond much faster. Machine-learning algorithms detect the neurological signals that mark the brain's interest in a given image and help the computer rank the images by interest. If the person sees something interesting or different, the computer knows it even if the person does not.
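To make the ranking step concrete, here is a minimal sketch of the general idea: each flashed image yields an EEG "epoch" (a feature vector), a trained classifier scores each epoch for an interest response, and the images are re-ordered by that score. This is not the actual C3Vision code; the function names, feature layout, and the toy weights below are all invented for illustration.

```python
import numpy as np

def interest_scores(epochs, weights, bias=0.0):
    """Score each EEG epoch (one per flashed image) with a linear
    classifier; a higher score means a stronger 'interest' response."""
    logits = epochs @ weights + bias
    return 1.0 / (1.0 + np.exp(-logits))  # sigmoid -> pseudo-probability

def rank_images(epochs, weights):
    """Return image indices ordered from most to least interesting."""
    scores = interest_scores(epochs, weights)
    return np.argsort(scores)[::-1].tolist()

# Toy demo: 4 flashed images, 3 EEG features per epoch (values invented).
epochs = np.array([[0.1, 0.0, 0.2],
                   [0.9, 1.1, 0.8],   # image 1: strong neural response
                   [0.2, 0.1, 0.1],
                   [0.4, 0.5, 0.3]])
weights = np.ones(3)  # stand-in for a trained classifier's weights
print(rank_images(epochs, weights))  # -> [1, 3, 2, 0]
```

In a real system the features would come from preprocessed EEG channels time-locked to each image flash, and the weights would be learned from labeled training trials; the point here is only that a fast per-epoch score is enough to sort thousands of images by likely interest.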
As such, the system has been used in tests to scan satellite images for the presence of surface-to-air missiles faster and more accurately than either a human or a machine could alone. That accounts for DARPA's interest in the technology; the DoD research arm has sunk $4.6 million into developing the tech via a spinoff from the university. But the tech could also be used for a variety of other tasks that require the analysis of large volumes of visual data.
"The images flash too quickly for a person to consciously rate his or her interest in each one, but the brain's visual pathways respond much faster. Machine-learning algorithms detect the neurological signals that mark the brain's interest in a given image and help the computer rank the images by interest."
Combine the potential in that quote with some weird advanced tech that mates a Micro$oft Kinect to an EEG, and suddenly marketers would have the ability to monitor your subconscious at will.
Yeah, it's a long shot...or at least a ways down the road, but who here honestly believes that corporate America wouldn't slit throats to get their hands on something like that?
Either that or I'm just paranoid and two steps away from wearing a tinfoil helmet everywhere I go.
Kinda makes me think of those women at the computer consoles in "Ghost in the Shell". Flash information quickly on the screen and connect the computer to the brain to process the information quickly to get results.
"Either that or I'm just paranoid and two steps away from wearing a tinfoil helmet everywhere I go."
I'm just waiting for the article "Scientists discover how to use tinfoil as EEG"
Reading the article, I have a vision of a dystopian future where unwitting people are forced to surrender their minds' computing power to giant computer clusters, Matrix-style.
But that'll never happen.
DARPA is working on a new Intersect, Intersect 3.0!
Chuck is due for an upgrade next year!
I would be happy with a set of those expando-fingers. Just imagine the possibilities... *wipes away bit of drool*
What's funny is that I originally began watching 'Ghost in the Shell' after someone made another reference to the series in a comment on another article somewhat like this one, not too long ago. That series did give us a lot to reference...
Personally, I look forward to being a cyborg in the future.
I concur! On all counts!
And Ghost in the Shell is badass.
If corporate America could put this technology into a device like the Kinect (which, I agree, is f***n scary), it would probably use this to mine the minds of the general public for more effective advertising methods. I know that similar EEG research has already been done involving scary movies, to determine what kinds of situations or visual effects are most scary.
Also, this seems like a very effective method for facial recognition of criminals in places like airports or other high traffic areas.
All in all, pretty cool stuff.
Maybe someone could use it to make a decent spam filter.
I don't need an upgrade, thank you, nonetheless.
In Soviet Russia, you have The Matrix.
("The Matrix has you")
Great article, good to read news day by day about BCI technology. To get more of them, please visit www.neurogadget.com too.