Actually, the Affective Intelligent Driving Agent (AIDA) won't be riding in the backseat (it's mounted right on the dashboard), but it will make comments on how you drive. It also reacts to your emotional state and helps you navigate.
In other words, AIDA is like your highway helper. A robot pal you can bond with on those long lonely trips. Kind of like a naggy, whiny version of KITT.
Oh, and did I mention that it emotes with facial expressions? I have plenty of relatives that are perfectly willing to bitch about my driving as it is. On the other hand, maybe AIDA will qualify as an additional passenger on HOV lanes; then maybe it can tag along.
CAMBRIDGE, Mass. — MIT researchers and designers are developing the Affective Intelligent Driving Agent (AIDA), a new in-car personal robot that aims to change the way we interact with our car. The project is a collaboration between the Personal Robots Group at the MIT Media Lab, MIT's SENSEable City Lab, and the Volkswagen Group of America's Electronics Research Lab.

[Gizmodo via MIT]
'With the ubiquity of sensors and mobile computers, information about our surroundings is ever abundant. AIDA embodies a new effort to make sense of these great amounts of data, harnessing our personal electronic devices as tools for behavioral support,' comments professor Carlo Ratti, director of the SENSEable City Lab. 'In developing AIDA we asked ourselves how we could design a system that would offer the same kind of guidance as an informed and friendly companion.'
AIDA communicates with the driver through a small robot embedded in the dashboard. 'AIDA builds on our long experience in building sociable robots,' explains professor Cynthia Breazeal, director of the Personal Robots Group at the MIT Media Lab. 'We are developing AIDA to read the driver's mood from facial expression and other cues and respond in a socially appropriate and informative way.'
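The press release doesn't say how AIDA turns a detected mood into a "socially appropriate" response, so here's a minimal toy sketch of the idea: gate non-urgent messages on the driver's emotional state. The mood labels, message kinds, and the `deliver` function are all hypothetical; assume a separate (unshown) vision component has already classified the driver's facial expression.

```python
# Toy sketch of affect-aware message gating. Assumes a mood label has
# already been extracted from the driver's face by another component.
POSTPONABLE = {"fuel_tip", "route_suggestion"}  # hypothetical message kinds

def deliver(mood, message_kind):
    """Decide whether to speak a message now or defer it."""
    if message_kind not in POSTPONABLE:
        return "speak"          # safety-relevant messages always go through
    if mood in ("stressed", "angry"):
        return "defer"          # hold non-urgent tips until the driver calms down
    return "speak"

print(deliver("stressed", "fuel_tip"))      # defer
print(deliver("stressed", "lane_warning"))  # speak
```

The design point is simply that the same message can be timed differently depending on affect, which is what distinguishes a "sociable" agent from a plain navigation voice.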
AIDA communicates in a very immediate way: with the seamlessness of a smile or the blink of an eye. Over time, the project envisions, a kind of symbiotic relationship will develop between the driver and AIDA, with both parties learning from each other and establishing an affective bond.
To identify the set of goals the driver would like to achieve, AIDA analyses the driver's mobility patterns, keeping track of common routes and destinations. AIDA draws on an understanding of the city beyond what can be seen through the windshield, incorporating real-time event information and knowledge of environmental conditions, as well as commercial activity, tourist attractions, and residential areas.
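At its simplest, "keeping track of common routes and destinations" is a frequency count over a trip log. The release doesn't describe AIDA's actual data model, so the trip format below is an assumption, a sketch of the basic idea:

```python
from collections import Counter
from datetime import datetime

# Hypothetical trip log as (origin, destination, departure time) tuples;
# the real AIDA system's data format is not public.
trips = [
    ("home", "office",  datetime(2009, 10, 26, 8, 30)),
    ("office", "home",  datetime(2009, 10, 26, 18, 5)),
    ("home", "office",  datetime(2009, 10, 27, 8, 40)),
    ("home", "grocery", datetime(2009, 10, 27, 19, 0)),
    ("home", "office",  datetime(2009, 10, 28, 8, 35)),
]

def common_destinations(trip_log, top_n=3):
    """Rank destinations by how often the driver ends a trip there."""
    counts = Counter(dest for _, dest, _ in trip_log)
    return counts.most_common(top_n)

print(common_destinations(trips))
# [('office', 3), ('home', 1), ('grocery', 1)]
```

A real system would cluster GPS endpoints rather than use named places, but the ranking step is the same.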
'When it merges knowledge about the city with an understanding of the driver's priorities and needs, AIDA can make important inferences,' explains Assaf Biderman, associate director of the SENSEable City Lab. 'Within a week AIDA will have figured out your home and work location. Soon afterwards the system will be able to direct you to your preferred grocery store, suggesting a route that avoids a street fair-induced traffic jam. On the way AIDA might recommend a stop to fill up your tank, upon noticing that you are getting low on gas,' says Biderman. 'AIDA can also give you feedback on your driving, helping you achieve more energy efficiency and safer behavior.'
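Biderman's "within a week AIDA will have figured out your home and work location" hints at a classic heuristic: the place you most often arrive at in the evening is probably home, and the morning counterpart is probably work. This is a guess at the kind of inference involved, not AIDA's actual algorithm:

```python
from collections import Counter

# Hypothetical week of arrivals as (place, arrival hour 0-23) pairs.
arrivals = [
    ("42 Oak St", 18), ("1 Main St", 9),
    ("42 Oak St", 19), ("1 Main St", 8),
    ("42 Oak St", 22), ("1 Main St", 9),
    ("Cafe", 12),
]

def infer_home_work(arrival_log):
    """Label the most frequent evening arrival 'home', morning arrival 'work'."""
    evening = Counter(p for p, h in arrival_log if h >= 17 or h < 5)
    morning = Counter(p for p, h in arrival_log if 6 <= h <= 11)
    home = evening.most_common(1)[0][0] if evening else None
    work = morning.most_common(1)[0][0] if morning else None
    return home, work

print(infer_home_work(arrivals))  # ('42 Oak St', '1 Main St')
```

Once home and work are pinned down, the grocery-store and gas-stop suggestions in the quote become route-matching problems against those anchor trips.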
AIDA was developed in partnership with Audi, a premium brand of the Volkswagen Group, and the Volkswagen Group of America's Electronics Research Lab. The AIDA team is directed by Professor Cynthia Breazeal, Carlo Ratti, and Assaf Biderman. The SENSEable City Lab team is led by Giusy di Lorenzo and includes Francisco Pereira, Fabio Pinelli, Pedro Correia, E Roon Kang, Jennifer Dunnam, and Shaocong Zhou. The Personal Robots Group's technical and aesthetic team includes Mikey Siegel, Fardad Faridi, and Ryan Wistort, as well as videographers Paula Aguilera and Jonathan Williams. Chuhee Lee and Charles Lee represent the Volkswagen Group of America's Electronics Research Lab.