That's the idea behind the iPoint Presenter, a system developed by two Fraunhofer Institutes that will be presented at the massive CeBIT computer expo in Hanover this week. The system allows a person to stand in front of a large projection screen and manipulate the objects she sees using only her gestures: no touching, no pointing devices. Cameras track the user's movements and the software determines the rest: what her fingers are doing, how far she is from the screen, and whether she is using "multipointing interaction," in which complex commands are issued with multiple fingers. It was designed so that anyone can operate it intuitively, without the need for spoken words.
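The article does not describe how the software interprets the camera data, but the idea of mapping tracked fingertips and user distance to commands can be illustrated with a minimal sketch. Everything below (the `Fingertip` type, the gesture names, the distance threshold) is a hypothetical illustration, not the actual iPoint Presenter implementation:

```python
# Hypothetical sketch: choose a command from the number of tracked
# fingertips and the user's distance from the screen.

from dataclasses import dataclass

@dataclass
class Fingertip:
    x: float  # horizontal position in screen coordinates (assumed input)
    y: float  # vertical position in screen coordinates (assumed input)

def classify_gesture(fingertips: list[Fingertip], distance_m: float) -> str:
    """Map tracked fingertips and user distance to a command name."""
    if distance_m > 3.0 or not fingertips:
        return "idle"    # too far away, or no hands detected
    if len(fingertips) == 1:
        return "point"   # a single finger moves a cursor
    if len(fingertips) == 2:
        return "zoom"    # two fingers: a spread/pinch-style command
    return "rotate"      # three or more fingers: "multipointing" command

print(classify_gesture([Fingertip(0.5, 0.5)], 1.5))                       # point
print(classify_gesture([Fingertip(0.4, 0.5), Fingertip(0.6, 0.5)], 1.5))  # zoom
```

In a real system the fingertip list would come from computer-vision processing of the camera feeds; the sketch only shows the final classification step.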