
Research Uses Artificial Intelligence to Measure Human Emotions

New research presented virtually at the Cognitive Neuroscience Society (CNS) annual meeting shows how data-driven computational methods are being used to explain that most basic of human traits: emotions. Investigators believe their findings will overturn old ideas about the structure of emotions across humanity.

Scientists are applying computing power to understand everything from how we generate spontaneous emotions during mind-wandering to how we decode facial expressions across cultures.

Investigators believe the findings are important in characterizing how emotions contribute to well-being, the neurobiology of psychiatric disorders, and even how to make more effective social robots.

“Artificial intelligence (AI) enables scientists to study emotions in ways that were previously thought to be impossible, which is leading to discoveries that transform how we think emotions are generated from biological signals,” said Dr. Kevin LaBar of Duke University.

For decades, six core human emotions — fear, anger, disgust, sadness, happiness, and surprise — have been considered universal in human psychology. Yet despite the prevalence of this idea in society, experts contend that the scientific evidence shows these emotions are far from universal.

In particular, there is a significant gap in how these emotions are recognized from facial expressions across cultures, especially for people from East Asia, said Dr. Rachael Jack, a researcher at the University of Glasgow.

Jack has been working to understand what she calls the “language of the face”: how individual face movements combine in different ways to create meaningful facial expressions (much as letters combine to create words).

“I think of this a bit like trying to crack hieroglyphics or an unknown ancient language,” Jack said. “We know so much about spoken and written language, even hundreds of ancient languages, but we have comparatively little formal knowledge of the non-verbal communication systems we use every day and which are so critical to all human societies.”

In new work, Jack and her team have developed a novel data-driven method for building dynamic models of these face movements, like a recipe book of facial expressions of emotion. Her team is now transferring these models to digital agents, such as social robots and virtual humans, so that they can generate facial expressions that are socially nuanced and culturally sensitive.

From this research they have created a face movement generator that randomly selects a subset of individual face movements, such as an eyebrow raiser, nose wrinkler, or lip stretcher, and randomly sets the intensity and timing of each.

These randomly activated face movements combine to create a facial animation. Study participants from different cultures then categorize the animation according to the six classic emotions, or select “other” if they perceive none of them.

After many such trials, the researchers build a statistical relationship between the face movements presented on each trial and the participants’ responses, which produces a mathematical model.
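To make that logic concrete, here is a minimal sketch of such a reverse-correlation analysis in Python, assuming a toy inventory of face movements and a simulated participant who responds “happy” whenever the lip-corner puller is strongly active; the movement names, trial count, and decision rule are illustrative assumptions, not Jack’s actual stimuli or code.

```python
# Minimal sketch of the reverse-correlation logic described above --
# NOT the authors' actual pipeline. Movement names, trial count, and the
# simulated participant's decision rule are illustrative assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical inventory of individual face movements.
MOVEMENTS = ["brow_raiser", "nose_wrinkler", "lip_stretcher",
             "lip_corner_puller", "eye_widener", "jaw_drop"]
N_TRIALS = 2000

# 1) Generator: each trial randomly activates a subset of movements,
#    each with a random intensity in [0, 1).
X = rng.random((N_TRIALS, len(MOVEMENTS)))            # random intensities
X *= rng.random((N_TRIALS, len(MOVEMENTS))) < 0.5     # random subset active

# 2) Simulated participant: responds "happy" (1) whenever the lip-corner
#    puller is strongly active, "other" (0) otherwise.
y = (X[:, MOVEMENTS.index("lip_corner_puller")] > 0.6).astype(int)

# 3) Fit the statistical relationship between the movements shown on each
#    trial and the responses; the coefficients are the "recipe" for the
#    perceived emotion.
model = LogisticRegression().fit(X, y)
for name, weight in zip(MOVEMENTS, model.coef_[0]):
    print(f"{name:18s} {weight:+.2f}")
```

In this toy version, the lip-corner puller should receive by far the largest positive weight, mirroring how the fitted model reveals which face movements drive the perception of a given emotion.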

“In contrast to traditional theory-driven approaches where experimenters took a hypothesized set of facial expressions and showed them to participants across the world, we have added a psychophysical approach,” Jack said.

“It is more data-driven and more agnostic in sampling and testing facial expressions and, critically, uses the subjective perceptions of cultural participants to understand what face movements drive their perception of a given emotion, for example, ‘he is happy.'”

These studies have condensed the six supposedly universal facial expressions of emotion to just four cross-cultural expressions. “There are substantial cultural differences in facial expressions that can hinder cross-cultural communication,” Jack said. “We often, but not always, find that East Asian facial expressions have more expressive eyes than Western facial expressions, which tend to have more expressive mouths — just like Eastern versus Western emoticons!”

She adds that there are also cultural commonalities that can be used to support accurate cross-cultural communication of specific messages; for example, facial expressions of happiness, interest, and boredom are similar across Eastern and Western cultures and are easily recognized in both.

Jack and her team are now using their models to enhance the social signaling capabilities of robots and other digital agents that can be used globally. “We’re very excited to transfer our facial expression models to a range of digital agents and to see the dramatic improvement in performance,” she said.

Understanding how the subjective experience of emotion is mediated in the brain is the holy grail of affective neuroscience, said LaBar of Duke. “It is a hard problem, and there has been little progress to date.” In his lab, LaBar and colleagues are working to understand the emotions that emerge while the brain is mind-wandering at rest.

“Whether triggered by internal thoughts or memories, these ‘stream-of-consciousness’ emotions are the targets of rumination and worry that can lead to prolonged mood states, and can bias memory and decision-making,” he said.

Until recently, researchers had been unable to decode these emotions from resting-state signals of brain function. Now, LaBar’s team has applied machine learning tools to derive neuroimaging markers of a small set of emotions, such as fear, anger, and surprise. Moreover, the researchers have modeled how these emotions spontaneously emerge in the brain while subjects rest in an MRI scanner.

The core of the work has been training a machine learning algorithm to differentiate the patterns of brain activity that distinguish emotions from one another. The researchers feed a pattern classifier a training data set from a group of participants who listened to music and watched movie clips that induced specific emotions.

Using feedback, the algorithm learns to weigh the inputs coming from different regions of the brain to optimize the signaling of each emotion. The researchers then test how well the classifier can predict the elicited emotions in a new sample of participants, using the set of brain weights it generated from the training sample.
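A compressed sketch of that train-then-generalize logic, with random arrays standing in for real fMRI data, might look like the following; the emotion set, region count, and simulated signal are assumptions for illustration only.

```python
# Hedged sketch of cross-subject emotion decoding: train a classifier on
# emotion-induction data from one group, test on a new group. The region
# count, emotion set, and random stand-in "fMRI" features are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(1)
EMOTIONS = ["fear", "anger", "surprise"]
N_REGIONS = 50  # one feature per brain region

def simulate_group(n_trials):
    """Toy induction data: each emotion nudges a few 'regions' upward."""
    y = rng.integers(len(EMOTIONS), size=n_trials)
    X = rng.normal(size=(n_trials, N_REGIONS))
    X[np.arange(n_trials), y] += 2.0  # emotion-specific signal
    return X, y

# Train on participants who received the music/movie induction ...
X_train, y_train = simulate_group(300)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# ... then apply the learned brain weights to a NEW sample of participants.
X_test, y_test = simulate_group(100)
print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

Testing on an entirely new sample of participants, rather than on held-out trials from the same people, is what supports the claim that the learned brain patterns are emotion-specific across individuals.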

“Once the emotion-specific brain patterns are validated across subjects in this way, we look for evidence that these patterns emerge spontaneously in participants who are merely lying at rest in the scanner,” LaBar said.

“We can then determine whether the pattern classifier accurately predicts the emotions that people spontaneously report in the scanner, and identify individual differences.”
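Under the same toy assumptions, that last step might be sketched as sliding the validated classifier across a resting-state time series and flagging time points where one emotion’s evidence crosses a threshold; the threshold, data shapes, and stand-in classifier below are hypothetical.

```python
# Hypothetical sketch: scan a resting-state time series with a validated
# emotion classifier and flag moments where an emotion pattern emerges.
# The stand-in classifier, threshold, and data shapes are assumptions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
EMOTIONS = ["fear", "anger", "surprise"]
N_REGIONS = 50

# Stand-in for a classifier already validated on induced-emotion data
# (refit here on toy data only so this snippet runs on its own).
y = rng.integers(len(EMOTIONS), size=300)
X = rng.normal(size=(300, N_REGIONS))
X[np.arange(300), y] += 2.0
clf = LogisticRegression(max_iter=1000).fit(X, y)

# Resting-state scan: one feature vector per time point (e.g., per TR).
rest = rng.normal(size=(200, N_REGIONS))
probs = clf.predict_proba(rest)  # per-emotion evidence over time

THRESHOLD = 0.6  # arbitrary detection cutoff
for t, p in enumerate(probs):
    if p.max() > THRESHOLD:
        print(f"t={t:3d}: spontaneous {EMOTIONS[p.argmax()]} (p={p.max():.2f})")
```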

Source: Cognitive Neuroscience Society/EurekAlert


Rick Nauert, PhD

Dr. Rick Nauert has over 25 years’ experience in clinical, administrative and academic healthcare. He is currently an associate professor for Rocky Mountain University of Health Professions’ doctoral program in health promotion and wellness. Dr. Nauert began his career as a clinical physical therapist and served as a regional manager for a publicly traded multidisciplinary rehabilitation agency for 12 years. He has master’s degrees in health-fitness management and healthcare administration and a doctoral degree from The University of Texas at Austin, with a focus on health care informatics, health administration, health education and health policy. His research efforts have included telehealth, with a specialty in disease management.
