20 May AAAS lecture: Computers that Respond to Human Emotion

05/18/04

Washington, D.C. -- Researchers are building computers that can detect and respond to a person's emotions. Prototypes can measure emotional expression through signals such as facial expressions and voice changes, and respond with human-like skills such as listening, empathy and sympathy. The research and technology behind emotionally responsive machines are both state of the art and controversial.

On Thursday, 20 May, AAAS, the science society, will host a public lecture on "Affective Computing: Toward Computers that Recognize and Respond to Human Emotion." The lecture begins at noon, is free and open to the public, and will be held in the AAAS auditorium at 1200 New York Avenue N.W., Washington, D.C.

What might be the personal and social impact of these products?

"If my 5-year-old loses too many games of tic-tac-toe in a row, and I sense her frustration, I might decide to let her win one," said Connie Bertka, director, AAAS Dialogue on Science, Ethics and Religion (DoSER). "Maybe my computer will be able to do the same for me. But do I want it to be able to make that choice?" DoSER seeks to help the public generally, and the religious communities particularly, understand advancements in science and technology, such as affective computing, and to use this understanding as the foundation for exploring the ethical and religious implications of these advancements.

"I was first interested in building computers that could be smarter in that they would be able to process auditory and visual information at the same time," says Rosalind W. Picard, founder and director, Affective Computing Research Group, Massachusetts Institute of Technology (MIT) Media Laboratory; and co-director, Things That Think Consortium. To do this, she started researching how the human brain is able to process multiple senses simultaneously. "I kept bumping into lower level brain structures that help weigh things and that have a lot to do with human emotions. In a similar fashion, it looked like computers could be 'fixed' if they could also incorporate emotions."

According to MIT's Affective Computing Research Group, perhaps the most fundamental next-generation application will be a human interface through which a computer can recognize, and respond to, the emotional states of its user. A user who becomes frustrated or annoyed with a product would "send out signals" to the computer, and the application would ideally respond in ways the user sees as "intuitive." For example, a computer piano tutor might change its pace and presentation based on naturally expressed signals that the user is interested, bored, confused or frustrated.
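To make the idea concrete, here is a minimal sketch of such a recognize-and-respond loop in Python. Everything in it is hypothetical: the state labels, the toy infer_state classifier, and the pacing policy are invented for illustration and are not the MIT group's actual software.

    # Hypothetical affect-adaptive tutor loop (illustration only).
    PACING = {
        "interested": "continue",    # keep the current lesson plan
        "bored":      "speed_up",    # raise difficulty or skip ahead
        "confused":   "slow_down",   # re-explain with a simpler piece
        "frustrated": "ease_off",    # drop back to an easier exercise
    }

    def infer_state(face_score: float, voice_score: float) -> str:
        """Toy classifier: combine two signal scores (0..1) into a
        coarse label. A real system would use trained models over
        many more signals."""
        arousal = (face_score + voice_score) / 2
        if arousal < 0.25:
            return "bored"
        if arousal < 0.5:
            return "interested"
        if arousal < 0.75:
            return "confused"
        return "frustrated"

    def next_action(face_score: float, voice_score: float) -> str:
        """Map the inferred emotional state to a pacing decision."""
        return PACING[infer_state(face_score, voice_score)]

    print(next_action(0.9, 0.8))  # -> "ease_off"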

"So instead of an interface emotion going only one way, like smiling paper clips that pop up on your screen, it needs to be responsive to our emotions," Picard explains. "If we send it an emotional signal saying 'we don't like this' it should respond and offer the user the option to turn itself off or apologize."

However, Picard is quick to explain the distinction between human feelings and functions that operate on a similar level in a computer. "We might give the computer a set of software functions that perform basic operations similar to those functions that operate in humans, but it's not the same as feelings. A robot might still pull its hand back when it encounters a sharp object, but it doesn't feel pain like you and I do."
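That distinction can be made concrete with a toy sketch: the rule below produces pain-avoidance behavior, yet nothing in it experiences anything. The function name and threshold are invented for the example.

    # Hypothetical reflex: a rule that mimics pain-avoidance
    # behavior without any subjective experience behind it.
    def hand_controller(contact_force: float, force_limit: float = 5.0) -> str:
        """Withdraw when contact force exceeds a safety limit."""
        if contact_force > force_limit:
            return "retract"   # behaves like a pain reflex...
        return "hold"          # ...but nothing here "feels" anything

    print(hand_controller(9.2))  # -> "retract"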

Picard will give an overview of her group's latest research on computers that both sense and respond to human emotions. Responding to her remarks will be Paul Root Wolpe, senior faculty associate in the Department of Psychiatry at the University of Pennsylvania, who will focus on the ethical implications of affective technology and its relationship to understanding people.

Source: EurekAlert & others
