MIT researchers and designers are developing the Affective Intelligent Driving Agent (AIDA) – a new in-car personal robot that aims to change the way we interact with our car. The project is a collaboration between the Personal Robots Group at the MIT Media Lab, MIT’s SENSEable City Lab and the Volkswagen Group of America’s Electronics Research Lab.
AIDA communicates with the driver through a small robot embedded in the dashboard. AIDA is designed to read the driver’s mood from facial expressions and other cues and respond in a socially appropriate and informative way.
AIDA communicates in a very immediate way as well: with the seamlessness of a smile or the blink of an eye. Over time, the project envisions that a kind of symbiotic relationship develops between the driver and AIDA, whereby both parties learn from each other and establish an affective bond.
To identify the set of goals the driver would like to achieve, AIDA analyzes the driver’s mobility patterns, keeping track of common routes and destinations. AIDA draws on an understanding of the city beyond what can be seen through the windshield, incorporating real-time event information and knowledge of environmental conditions, as well as commercial activity, tourist attractions, and residential areas.
It merges knowledge about the city with an understanding of the driver’s likely priorities and needs, and from these learned patterns AIDA can make useful inferences. Within a week, AIDA will have figured out your home and work locations. Soon afterward, the system will reportedly be able to direct you to your preferred grocery store, suggesting a route that avoids a street-fair-induced traffic jam. On the way, AIDA might recommend a stop to fill up your tank after noticing that you are low on gas. AIDA is also said to give feedback on your driving, helping you drive more efficiently and safely.
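The home-and-work inference described above can be sketched with a simple frequency heuristic: the place most often visited overnight is likely home, and the place most often visited on weekday daytimes is likely work. The sketch below is purely illustrative; the place identifiers, the time-window thresholds, and the heuristic itself are assumptions, not AIDA's actual method.

```python
from collections import Counter
from datetime import datetime

def infer_home_work(fixes):
    """Guess home and work from a list of (iso_timestamp, place_id) fixes.

    Heuristic (assumed, not AIDA's real algorithm): the most frequent
    nighttime place is 'home'; the most frequent weekday-daytime place
    is 'work'.
    """
    night = Counter()  # places seen overnight -> home candidates
    day = Counter()    # places seen on weekday daytimes -> work candidates
    for ts, place in fixes:
        t = datetime.fromisoformat(ts)
        if t.hour >= 22 or t.hour < 6:
            night[place] += 1
        elif t.weekday() < 5 and 9 <= t.hour < 17:
            day[place] += 1
    home = night.most_common(1)[0][0] if night else None
    work = day.most_common(1)[0][0] if day else None
    return home, work

# Hypothetical week-long log with two recurring places.
log = [
    ("2024-03-04T23:30:00", "loc_A"), ("2024-03-05T02:00:00", "loc_A"),
    ("2024-03-05T10:00:00", "loc_B"), ("2024-03-05T14:00:00", "loc_B"),
    ("2024-03-05T23:00:00", "loc_A"), ("2024-03-06T11:00:00", "loc_B"),
]
home, work = infer_home_work(log)
```

A real system would of course cluster raw GPS coordinates rather than use pre-labeled place IDs, and would combine this with the contextual city data the project describes.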