Imagine a gesture-based mobile device with no screen, no keyboard, and no other peripheral inputs or outputs, a mobile device that's not really a device at all. Can you see it in your mind's eye? If so, you're probably picturing something akin to a new "imaginary" interface envisioned by a German research student who wants to let users imagine their own graphical interfaces, operating their conjured keyboards via spatial memory and proprioception.
The imaginary interface aims to solve two problems faced by existing mobile devices. First, users currently must use a touchpad, mouse, or buttons to navigate preconfigured on-screen options. In other words, we learn the devices, rather than the other way around. Second, these devices are limited by their physical nature; buttons and screens can only become so small before they become unusable.
Other gesture-based interfaces have attempted to circumvent these problems to some degree, but they still rely on a screen or some kind of projection device to create a visual interface that a user must learn and interact with. Sean Gustafson of Potsdam University's Hasso Plattner Institute wants to completely untether the mind from the device.
The only hardware required is a two-inch-square device that attaches to clothing on a user's chest (eventually Gustafson and company would like to shrink it to the size of a button). A ring of LEDs projects infrared light into the space immediately in front of the user so a camera in the device can interpret gestures. The user can then imagine his or her own interface; for instance, by making an "L" shape with the non-dominant hand, the user creates an imaginary plane in which the interface will exist. From there, the device has to do some learning, but the user can place buttons -- say, a "send" button for messages or an area for making simple sketches with the hands -- wherever he or she wants in space.
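To make the idea concrete, here is a minimal sketch of how such an "imaginary plane" might work in code: the tracked wrist, index fingertip, and thumb tip of the L-shaped hand define a 2D coordinate frame, and taps by the other hand are hit-tested against buttons the user placed earlier. The function names, the geometry, and the tolerances are our own assumptions for illustration, not Gustafson's actual implementation.

```python
import numpy as np

def plane_frame(wrist, thumb_tip, index_tip):
    """Build an origin and two unit axes from three tracked 3D points.

    The index finger gives the x-axis, the thumb (orthogonalized
    against it) gives the y-axis, and the wrist is the origin.
    """
    x_axis = index_tip - wrist
    x_axis = x_axis / np.linalg.norm(x_axis)
    y_raw = thumb_tip - wrist
    # Remove any component of y along x so the axes are orthogonal.
    y_axis = y_raw - np.dot(y_raw, x_axis) * x_axis
    y_axis = y_axis / np.linalg.norm(y_axis)
    return wrist, x_axis, y_axis

def to_plane_coords(point, frame):
    """Project a tracked 3D point into the 2D imaginary plane."""
    origin, x_axis, y_axis = frame
    d = point - origin
    return np.dot(d, x_axis), np.dot(d, y_axis)

def hit_test(point, frame, buttons, radius=0.03):
    """Return the name of the closest remembered button within
    `radius` meters of the tap, or None if nothing is close enough."""
    u, v = to_plane_coords(point, frame)
    best, best_d = None, radius
    for name, (bu, bv) in buttons.items():
        d = ((u - bu) ** 2 + (v - bv) ** 2) ** 0.5
        if d < best_d:
            best, best_d = name, d
    return best
```

In this sketch, "placing a button" would just mean storing its plane coordinates in the `buttons` dictionary when the user first taps it into existence; recalling it later relies entirely on the user's memory of where that spot is.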
Remembering an invisible layout seems difficult within the current paradigm of visual interfaces, but Gustafson believes that if a user places elements in his or her own interface, that person will remember that placement both visually and proprioceptively (call it muscle memory mixed with the spatial perception of the limbs). Of course, if a user truly forgets the location of something within the interface, it can simply be redrawn.
The system still needs some tweaking, of course -- the infrared doesn't work well in bright light, and its ability to perceive highly precise gestures is less than perfect -- but the idea is intriguing. The mobile interfaces of the future might be more than mobile in the it-fits-in-your-pocket sense; anywhere your imagination goes, your own custom mobile interface goes as well.
Now we wonder if the man is speaking on a phone we don't see or just talking to himself. Coming up next: is he designing something on his pocket computer or should we call an ambulance?
Meanwhile, this guy actually does it.
The whole imagining thing is pointless.
Imagining? Yeah, right....
Not only does he have to learn now, he also has to remember what he imagined.....
Unless we project directly into our minds and give commands through thought, we will always need these devices to interact....
The next logical step would be to create a device much like a cell phone, multimedia device, storage, etc., along with either a pair of sunglasses or contact lenses that could receive a signal from the device and then project the image out in front of you, along with the ability to use hand motions, menus, etc. If anybody has played the game Heavy Rain, this is along the lines of what I'm thinking. I reckon this tech will be around in 10 to 15 years. We're already making very basic contacts that display 8x8 black and white; give that time to come along, and then we just have to wait till we can get that sort of processing power and OS working nicely.
Tell me what you think?
I'm with rothm4n. Display glasses that read your hand movements in front would put all the functions in a neat little package.
I think the real next step would be neural interfacing via a small wire that runs from the back of the frame on the glasses and plugs you directly into the process without the need to move your hands at all. At that point the level of production would only be limited by the speed/precision of your thoughts (and connection speed) instead of ungainly physical actions.