Back in 2009, we wrote about a little robotic dashboard companion called AIDA (for Affective Intelligent Driving Agent), an MIT creation that read a driver's facial expressions to gauge mood and inferred route and destination preferences through social interaction with the driver. Apparently that was deemed too distracting, so now MIT is back with AIDA 2.0, which swaps the dashboard robot for a massive 3-D interactive map that covers the entire dashboard, because that's not distracting at all.

But it is pretty cool. Essentially, AIDA 2.0 would aid the driver by turning all of that unused dashboard real estate into a gesture-controlled three-dimensional display that can control everything from the stereo to the air conditioning, as well as show mapping information in the driver's peripheral vision.

Like its predecessor, AIDA 2.0 learns your route and destination preferences and habits. Building on that data, it tries to determine your goals for a given trip and optimizes the display to help you execute those plans. All of this is augmented by real-time information on road conditions, weather, and traffic, laid out prominently in front of the driver (the display even overlays onto the rearview mirrors). How could one possibly become distracted?