Dartmouth College researchers are giving a whole new meaning to the word iPhone. Ahem. Make that Eye-Phone.
Eye-tracking could soon be coming to cell phones, allowing hands-free control of mobile devices that goes way beyond voice activation, Technology Review reports.
Eye-tracking itself is nothing new: it has been used for years by people with disabilities to control computers, by advertisers, and by the military. But it is difficult to pull off on a small, moving device like a cellphone.
The Dartmouth team, led by professor Andrew Campbell, devised a new algorithm that learns to identify a person’s eye movements under different conditions.
First, the user calibrates the system by snapping pictures of his or her eyes both indoors and outdoors. During this learning phase, the software is trained to recognize eye movements under various lighting conditions.
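The article doesn't spell out how the training works; as a loose illustration only, calibration amounts to collecting labeled eye snapshots under each lighting condition and building a per-condition template that later matching can adapt to (the function names and the brightness-based matching here are hypothetical, not the paper's actual method):

```python
import statistics

def calibrate(samples):
    """Build a mean-brightness template per lighting condition.

    `samples` maps a condition label ("indoor"/"outdoor") to a list of
    brightness readings taken from calibration snapshots of the user's eye.
    """
    return {cond: statistics.mean(vals) for cond, vals in samples.items()}

def best_condition(templates, brightness):
    """Pick the calibrated condition whose template is closest to the
    current frame's brightness, so detection thresholds can adapt."""
    return min(templates, key=lambda c: abs(templates[c] - brightness))
```

In this sketch, each new camera frame would first be matched to the nearest calibrated lighting condition before the eye is located.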
Running on a Nokia N810 tablet, EyePhone tracks the position of the user’s eye relative to the screen, rather than where a person is looking. The software divides the camera frame into nine regions and looks for the eye in one of those regions, Tech Review reports.
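The nine-region idea is simple to picture: the camera frame is cut into a 3-by-3 grid, and the tracker reports which cell the eye falls in. A minimal sketch, assuming a hypothetical upstream detector has already found the eye-center pixel:

```python
def region_of(frame_w, frame_h, eye_x, eye_y):
    """Map an eye-center pixel coordinate to one of nine frame regions.

    Regions are numbered 0-8, left to right, top to bottom, by splitting
    the frame into a 3x3 grid.
    """
    col = min(eye_x * 3 // frame_w, 2)  # clamp so edge pixels stay in grid
    row = min(eye_y * 3 // frame_h, 2)
    return row * 3 + col

# Example: on a 480x320 frame, an eye centered at (400, 50)
# lands in region 2 (top-right).
```

Mapping to a coarse grid rather than a precise gaze point is what makes the approach feasible on a phone camera, at the cost of only nine selectable targets.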
The user has to arrange the phone so a virtual “error box” is situated around his or her eye; the system can recognize the eye as long as it stays in this box. Blinking equates to a mouse click, allowing users to choose an application.
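The per-frame logic described here can be sketched as follows; this is an illustrative reconstruction, not the paper's code, and the class and function names are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class ErrorBox:
    """The virtual box the user positions around his or her eye."""
    x: int
    y: int
    w: int
    h: int

    def contains(self, ex, ey):
        return self.x <= ex < self.x + self.w and self.y <= ey < self.y + self.h

def handle_frame(box, eye, blink_detected, open_app):
    """Hypothetical per-frame handler: track only while the detected eye
    center stays inside the error box, and treat a blink as a click."""
    if eye is None or not box.contains(*eye):
        return "lost"       # eye left the box; recognition stops
    if blink_detected:
        open_app()          # blink acts as the mouse click
        return "clicked"
    return "tracking"
```

A real implementation would run this on every camera frame, debouncing blinks so that a single long blink does not fire multiple clicks.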
The team will present a paper (PDF) on their findings at a workshop in New Delhi, India, in August.
It’s a pretty basic system, but the researchers hope to develop more advanced methods. That shouldn’t be too hard for Campbell, whose lab has previously dabbled with something it calls NeuroPhone — an iPhone that taps into your brain. That system uses an off-the-shelf wireless EEG headset to control an iPhone. A mind-controlled contact app flashes photos of contacts, and when the user sees the person she wants to call, her brain activity triggers the system and tells the iPhone to dial that person.