(PhysOrg.com) -- If you remember the AIDA (Affective Intelligent Driving Agent) system, which debuted roughly a year and a half ago, then you remember that it was a joint project between MIT and Volkswagen that put a robot head in your dashboard. The head gave driving directions to the driver. The newest version, AIDA 2.0, has gotten rid of the talking head and turned the car's entire interior into one large navigation display.
In the AIDA 2.0 system, all of the information that the driver needs is projected onto the dashboard and surrounding areas. While this makes the information easily accessible, it may also create distractions on the road. The new virtual display spans the entire dashboard, the console, the instrument panel, and the wing mirrors. Working in conjunction, these surfaces form a single virtual display that updates itself as you move.
While this idea does seem really cool, like something out of a Tron movie, it stretches the driver's view across the cabin and could potentially distract attention from the road ahead and from other cars.
On the bright side, the system is both adaptive and considerate. Over time, it learns facts about you, such as the kinds of places where you like to eat and the activities that interest you. It then searches through information about the surrounding area and points out nearby things you may care about. As with any adaptive system, the more you use it, the better it becomes.
No word has been given yet on when consumers will see the AIDA 2.0 system in production cars.
More information: senseable.mit.edu/aida/