
Last year, we reported on Microsoft's Surface Computing touchscreen in our Best of What's New issue. It looks like a coffee-table-sized iPhone, except that instead of using your fingers on the screen to scroll and zoom, you can use them to grab, move, sort, and rotate any number of items you see. As for the Wii, well, everybody knows about the Wii by now: its controllers use an accelerometer and infrared sensors to figure out where, and how quickly, you're pointing at your television. Now imagine those two things mashed together, without any external devices at all.

That's the idea behind the iPoint Presenter, a system developed by two Fraunhofer Institutes that will be presented at the massive computer expo CeBIT in Hanover this week. The system allows a person to stand in front of a large projection screen and manipulate the objects she sees using gestures alone: no touching, no pointing devices. Cameras track the user's movements and the software determines the rest: what her fingers are doing, how far she is from the screen, and whether she is using "multipointing interaction," issuing complex commands with multiple fingers at once. It was designed so that anyone can operate it intuitively, without the need for spoken commands.
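Fraunhofer hasn't published the iPoint Presenter's internals, but the basic pipeline described above, camera tracking plus finger-level interpretation, can be sketched in a few dozen lines. Below is a minimal, hypothetical illustration in Python using OpenCV (our choice for the sketch, not the institutes' actual software): it isolates a hand with background subtraction and counts raised fingers from convexity defects, the kind of raw signal a "multipointing" system would translate into commands.

```python
import cv2

# Hypothetical sketch of camera-based finger tracking, loosely inspired by
# systems like the iPoint Presenter (not its actual implementation).
# Assumes a single webcam (device 0) and a roughly static background.

cap = cv2.VideoCapture(0)
bg = cv2.createBackgroundSubtractorMOG2()  # separates the moving hand from the scene

while True:
    ok, frame = cap.read()
    if not ok:
        break

    mask = bg.apply(frame)                 # foreground (hand) mask
    mask = cv2.medianBlur(mask, 5)         # suppress speckle noise
    _, mask = cv2.threshold(mask, 127, 255, cv2.THRESH_BINARY)

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        hand = max(contours, key=cv2.contourArea)  # assume largest blob is the hand
        if cv2.contourArea(hand) > 3000:           # ignore tiny blobs
            hull = cv2.convexHull(hand, returnPoints=False)
            if hull is not None and len(hull) > 3:
                defects = cv2.convexityDefects(hand, hull)
                # Deep convexity defects sit in the valleys between extended
                # fingers, so counting them gives a crude finger count.
                fingers = 0
                if defects is not None:
                    for i in range(defects.shape[0]):
                        _, _, _, depth = defects[i, 0]
                        if depth > 10000:  # depth is in 1/256-pixel units
                            fingers += 1
                cv2.putText(frame, f"fingers: {fingers + 1}", (10, 30),
                            cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)

    cv2.imshow("tracker", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

A real system like the one shown at CeBIT would add a second camera for depth (to judge how far the user stands from the screen) and a trained gesture vocabulary, but the flow would be similar: segment the hands, extract finger-level features, map them to on-screen commands.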

The possibilities for a device like this are, to say the least, limitless. Aside from increasingly mind-bending video games, the researchers expect it to be especially useful for presentations in large spaces, where the speaker is physically too far from the screen and projector to use traditional pointers and remotes. They are now hard at work expanding the library of human gestures the system can interpret and developing new ways for computers to learn to recognize movements on their own.

Via PhysOrg