Researchers at the University of Minnesota today revealed a drone that can be controlled merely by thought, and that's not even the coolest thing about it. Published in the Journal of Neural Engineering, the project has implications for everything from unmanned vehicles to paraplegic mobility.
The setup here is pretty basic, futuristic though it seems. The drone is a commercially available four-blade helicopter--the Parrot AR quadrotor--which is basically a drone hobbyist's Model T. To control it, the "pilot" wears a funny hat, the sensing end of an electroencephalogram (EEG). An EEG places an array of electrodes over a person's head, in a totally non-invasive way, then picks up on electrical activity in the brain. A cluster of activity, like thinking about making a fist with the right hand, generates a spark in a specific area of the brain. That spark gets translated through a computer into a quadrotor command ("turn right"). The command is then beamed to the quadrotor via WiFi.
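The pipeline described above can be sketched in a few lines of Python. This is a minimal illustration, not the researchers' actual software: the imagery labels, command names, and the drone's address and port are all hypothetical placeholders.

```python
import socket

# Hypothetical mapping from decoded motor-imagery classes to flight commands.
# The labels and command strings are illustrative, not the paper's protocol.
IMAGERY_TO_COMMAND = {
    "right_fist": "turn_right",
    "left_fist": "turn_left",
    "both_fists": "ascend",
    "rest": "descend",
}

def imagery_to_command(label: str) -> str:
    """Translate a classifier's output label into a flight command.

    Unrecognized labels fall back to "hover" so noisy decoding
    never produces an unintended maneuver.
    """
    return IMAGERY_TO_COMMAND.get(label, "hover")

def send_command(command: str, host: str = "192.168.1.1", port: int = 5556) -> None:
    """Beam a command to the quadrotor over its WiFi link via UDP.

    Address and port are placeholders for whatever the drone listens on.
    """
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(command.encode(), (host, port))

# A decoded "right fist" thought becomes a turn-right command:
# send_command(imagery_to_command("right_fist"))
```

The fallback-to-hover default is the interesting design choice here: a brain-computer interface misclassifies often enough that the safe behavior for an unknown signal should be "do nothing," not "repeat the last command."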
Ideally, it works like in the image below:
Previously, researchers showed that subjects could control a virtual helicopter with their thoughts. The latest demo--using a real, live helicopter--is another step toward more practical applications, with the ultimate goal of helping people with disabilities and neurodegenerative diseases regain mobility, says researcher Bin He, a professor of biomedical engineering at the University of Minnesota.
Why demonstrate the system with a drone? "It's more manageable to test in protocol than it is to do on a patient with a prosthetic arm," he says. In the published paper, the pilot is illustrated as someone in a wheelchair, sitting in front of a laptop that's streaming back everything the drone sees. A thought-controlled wheelchair would probably help that person out more, but ground-bound wheelchairs rarely encounter something both drones and arms deal with every day: precisely navigating three-dimensional space. Flying a drone through a hoop turns out to be good practice for maneuvering a hand to a mouth, say, or putting an arm through a sleeve.
The system is relatively easy to use. A series of exercises trained the five students who piloted the drone. The first was a Pong-like game in which the subject used the EEG controls just to move a dot on a screen right or left. The second controlled up/down motion, a third combined the two in a similar interface, and the fourth exercise before actually piloting the drone was a flight simulation steering the drone through hoops above an imaginary town.
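The first training stage--nudging a dot left or right with decoded imagery--can be sketched as a tiny update rule. This is a hedged illustration of the idea, not the lab's training software; the label names and step size are assumptions.

```python
def update_cursor(x: float, label: str, step: float = 0.05) -> float:
    """Move a 1-D training cursor from a decoded motor-imagery label.

    Left-fist imagery nudges the cursor left, right-fist imagery nudges
    it right, and anything else (rest, noise) leaves it in place. The
    position is clamped to the screen range [-1, 1].
    """
    delta = {"left_fist": -step, "right_fist": step}.get(label, 0.0)
    return max(-1.0, min(1.0, x + delta))

# Running this once per decoded EEG window turns a stream of imagined
# hand movements into smooth cursor motion -- the same loop the later
# 2-D stage extends with an up/down axis before graduating to the drone.
```

The point of staging the exercises this way is that each one reuses the same decode-then-nudge loop while adding one axis of control at a time.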
There's a funny story about the sixth test pilot. The researchers were asked to put together a video of how the drone worked, and on short notice they couldn't find any of the original test pilots. Graduate student Brad Edelman had passed the first couple of exercises, though, and he happened to be in the lab at the time, so they sat him down to try out the full thing. As you can see below, he flew it perfectly, suggesting that the technology is intuitive enough for some people to just pick up and use.
Can I place an order for, oh say, 10,000? Oh and can they be equipped with laser beams?
Jokes aside, this is awesome; another expansion of using brain wave readings to control machines. This will only continue to develop and evolve. Our brains are fascinating machines in their own right, fully capable of learning to use new constructs like machine interfaces. Brain(thought)/machine interfaces will only become more commonplace in the next 10 to 20 years; entertainment, military, transportation, medical, manufacturing--just a few of the industries that will be able to take full advantage of the technology as it progresses.
I'm curious as to how simple to use this really is. I mean, does the user have to imagine himself closing his hand so hard that he's incapable of thinking of anything else? I'd love for this technology to maybe branch out and be used with, let's say... the Oculus Rift. However, I don't see it being a valuable controller unless the actions feel somewhat natural and not forced.
Louann Brizendine, author of "The Female Brain," writes that men think about sex every 52 seconds.
I agree that this is a terrific, and frankly thrilling, preliminary step. But to make a significant difference to disabled people, it has to evolve into something that can interpret multitasking. I can talk to a person (threaten them, maybe) while simultaneously making a fist. I suspect the EEG equipment is years away from being able to differentiate discrete and simultaneous motor control actions - say, chewing one's food while at the same time lifting the hand to the mouth, delivering the next bite. It will take quite a bit of user training as well as a lot more brain-activity-tracking software development to reach the next level.
At this stage of development it looks like a really cool toy.
@Zemilio, that comment doesn't seem to apply to this article, but I bet the elapsed time between such male thoughts isn't really that long.
Remote controlled Helicopters today. Gundam suits tomorrow!
Seriously, this could do wonders for people with spinal cord injuries. At the same time, it's kind of creepy when you look at how far you could take it. Imagine having a smaller version of this implanted in your brain, able to read when you experience pain and then able to send tiny robots in your bloodstream to that location to repair the damage.
Wow, this is like MindFlex on caffeine. I'd love to see a drone battle with two or more folks. But MindFlex makes me VERY tired and gives me a slight headache after playing for more than 15 minutes or so.
I wonder if there are any side effects from this?
Love, Peace & Soul
Very nice idea for disabled people like me. They should make vacuum cleaners etc.
Btw, where can I buy one and how much will it cost me?