A Robot for My Co-Pilot

In-Dash Robot Uses Facial Expressions To Communicate With Driver

Wired's "Autopia" blog (11/17, Squatriglia) reported, "Audi and the Massachusetts Institute of Technology envision a future where robots riding shotgun make us happier, safer drivers and create a 'symbiotic relationship' between car and driver." The robot, called the Affective Intelligent Driving Agent, or Aida, "would analyze our driving habits, keeping track of frequent routes and destinations to provide real-time traffic info, and make friendly suggestions along the way," as well as "give gentle reminders to buckle up, watch our speed or slow down for that school bus up ahead." The robot "uses a small laser video projector to convey facial expressions and other information." Having "human-like motion" and the ability to express emotions, researchers say, "makes it easier to convey information," since "reading a facial expression is instantaneous." The researchers "plan to build a driving simulator for a controlled study" by next year, and "real-world tests will follow in 2011."

Reposted from the November 18, 2009 ASEE First Bell briefing.
