A system originally built for robot navigation could give blind people the equivalent of a Braille head-up display, according to French researchers. Two cameras mounted on a pair of glasses generate a three-dimensional image of the wearer's environment and their place in it, and the system displays that information on a handheld Braille device.
AIDA reacts to the driver's facial expressions and other cues, responding in the proper social context.
With all the sensors, computerized gadgetry and even Internet connectivity being built into cars these days, it's a wonder our automobiles aren't more like Optimus Prime. Our cars will now email us when they need an oil change and recognize our facial expressions to determine whether we're enjoying ourselves. But for all the information available to us when we're driving, it's rarely organized in real time and packaged in a way we can digest while behind the wheel. Researchers at MIT and Audi created the Affective Intelligent Driving Agent to address exactly that problem.
And you thought that lady from your car's navigation system was stern. Researchers from Kajimoto Laboratory came up with a GPS navigation helmet that doesn't give directions in words; it "shows" the wearer which way to go by tugging on the appropriate ear, just like mom used to do. "Being pulled on the ear for navigation is a common situation when we were children," the researchers write, "and hence, the sensation should be quite intuitive."