One of the problems facing people with autism is an inability to pick up on social cues. Failure to notice that they are boring or confusing their listeners can be particularly damaging, says Rana El Kaliouby of the Media Lab at the Massachusetts Institute of Technology. "It's sad because people then avoid having conversations with them."
The "emotional social intelligence prosthetic" device, which El Kaliouby is constructing along with MIT colleagues Rosalind Picard and Alea Teeters, consists of a camera small enough to be pinned to the side of a pair of glasses, connected to a handheld computer running image-recognition software together with software that can read the emotions those images show. If the wearer seems to be failing to engage his or her listener, the software makes the handheld computer vibrate.
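The article does not describe how the device decides when to vibrate, but the idea of triggering an alert when the listener appears disengaged can be sketched in a few lines. Everything here is an illustrative assumption, including the function name, the engagement-score input, and the threshold values; the team's actual logic is not specified.

```python
# Hypothetical sketch of the prosthetic's feedback rule (all names and
# thresholds are assumptions for illustration, not the MIT team's design).

def should_vibrate(engagement_scores, threshold=0.3, window=3):
    """Trigger haptic feedback when the listener's estimated engagement
    stays below `threshold` for `window` consecutive readings."""
    consecutive_low = 0
    for score in engagement_scores:
        consecutive_low = consecutive_low + 1 if score < threshold else 0
        if consecutive_low >= window:
            return True
    return False
```

For example, a brief dip in engagement would not fire the alert, but a sustained one would: `should_vibrate([0.8, 0.2, 0.1, 0.1])` returns `True`, while `should_vibrate([0.8, 0.2, 0.9, 0.2])` returns `False`. Requiring several consecutive low readings avoids buzzing the wearer over a single momentary glance away.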
In 2004 El Kaliouby demonstrated that her software, developed with Peter Robinson at the University of Cambridge, could detect whether someone is agreeing, disagreeing, concentrating, thinking, unsure or interested, just from a few seconds of video footage. Previous computer programs had detected only the six more basic emotional states of happiness, sadness, anger, fear, surprise and disgust. El Kaliouby's complex states are more useful because they come up more frequently in conversation, but they are also harder to detect, because they are conveyed in a sequence of movements rather than a single expression.
Her program is based on a machine-learning algorithm that she trained by showing it more than 100 8-second video clips of actors expressing particular emotions. The software picks out movements of the eyebrows, lips and nose, and tracks head movements such as tilting, nodding and shaking, which it then associates with the emotion the actor was showing. When presented with fresh video clips, the software gets people's emotions right 90 per cent of the time when the clips are of actors, and 64 per cent of the time on footage of ordinary people.
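The training-and-recognition process described above can be illustrated with a deliberately simplified sketch: each clip is reduced to a vector of facial-movement features, labelled feature vectors are averaged into one prototype per emotion, and a fresh clip is assigned the emotion of the nearest prototype. This nearest-centroid approach and the feature encoding are assumptions chosen for clarity; the article does not specify which learning algorithm the real system uses.

```python
# Toy nearest-centroid sketch of "train on labelled clips, then classify
# fresh footage". The feature vectors (e.g. eyebrow raise, lip movement)
# and the classifier choice are illustrative assumptions only.
import math

def train(labelled_clips):
    """Average each emotion's feature vectors into a prototype (centroid)."""
    sums, counts = {}, {}
    for features, label in labelled_clips:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, value in enumerate(features):
            acc[i] += value
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def classify(prototypes, features):
    """Return the emotion whose prototype is nearest in Euclidean distance."""
    return min(prototypes, key=lambda label: math.dist(prototypes[label], features))
```

The gap the article reports between actors (90 per cent) and ordinary people (64 per cent) is easy to rationalise in this picture: actors produce exaggerated, prototype-like expressions, while everyday footage sits further from the learned centroids.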
El Kaliouby is now training the software on excerpts from movies and footage captured by webcams. This week she plans to gather the first on-the-move training footage by equipping a group of volunteers, some of whom are autistic, with wearable cameras.
Getting the software to work is only the first step, Picard warns. In its existing form it makes heavy demands on computing power, so it may need to be pared down to work on a standard hand-held computer. Other challenges include finding a high-resolution digital camera that can be worn comfortably, and training people with autism to look at the faces of those they are conversing with so that the camera picks up their expressions.
The team will present the device next week at the Body Sensor Network conference at MIT. People with autism are not the only ones who stand to benefit. Timothy Bickmore of Northeastern University in Boston, who studies ways in which computers can be made to engage with people's emotions, says the device would be a great teaching aid. "I would love it if you could have a computer looking at each student in the room to tell me when 20 per cent of them were bored or confused."
THIS ARTICLE APPEARS IN NEW SCIENTIST MAGAZINE ISSUE: 1 APRIL 2006
Author: Celeste Biever