Reading Emotions From Facial Expressions May Be Partly Subjective, and Even the Best Models Fall Short

Summary: Even the best models for recognizing facial emotions fall short of real human judgment. Additionally, individual differences mean different people read different emotions from the same face, making it harder to ascertain exactly which facial movements are systematically linked to different emotional states.

Source: University of Glasgow

Does someone look angry or sad? You can probably answer that question just by looking at their face. That’s because facial expressions, combinations of small facial movements, can be read by other people as cues to what a person might be feeling at that exact moment.

Since Darwin’s seminal work on the evolutionary origins of facial expressions of emotion, scientists have been trying to find out which specific combinations of facial movements best represent our six basic emotions: happiness, surprise, fear, disgust, anger, and sadness.

So far, researchers have offered a range of theories—or models—to define which facial movements best match each emotion, but until now no one has been able to show which one is most accurate.

Now, a new study by a team of European researchers led by the University of Glasgow and University of Amsterdam has begun to answer that question.

The new study, which is published in Science Advances, shows that even the best models for predicting emotions from facial expressions fall short of the judgment of real human participants.

Moreover, different humans themselves may read different emotions from the same facial expression, making it even harder to pinpoint exactly which facial movements are systematically linked with certain emotions.

The research indicates that our brains take in more information than just facial movements when understanding emotions in another person and, importantly, that personal characteristics (such as one’s gender and culture) may also influence this process.

The research team asked 120 participants, 60 Western and 60 East Asian, to categorize a series of facial animation videos by which of the six classic basic emotions, if any, they perceived. The researchers then compared these human answers with the predictions of the existing models to see how accurate each model was.
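As a concrete illustration of that comparison, here is a minimal Python sketch; it is not the authors' code, and the videos, action-unit (AU) sets, and responses below are hypothetical. A hypothesis-based model is a set of required AU configurations, and its accuracy is the fraction of human categorizations it reproduces.

# A minimal sketch (not the authors' code) of scoring a hypothesis-based
# model against human emotion categorizations. All data are hypothetical.

# Hypothetical: each animation video is described by the action units (AUs)
# it displays.
videos = {
    "vid_01": {"AU4", "AU23"},         # brow lowerer + lip tightener
    "vid_02": {"AU1", "AU4", "AU15"},  # inner brow raiser + brow lowerer + lip corner depressor
}

# Hypothetical model: each emotion is defined by a required AU configuration.
model = {
    "anger": {"AU4", "AU23"},
    "sadness": {"AU1", "AU4", "AU15"},
}

def predict(aus):
    """Return the emotion whose required AUs the video contains, if any."""
    for emotion, required in model.items():
        if required <= aus:  # all required AUs are present
            return emotion
    return None  # the model predicts "no emotion perceived"

# Hypothetical human responses: participant -> {video: perceived emotion}.
human_responses = {
    "p01": {"vid_01": "anger", "vid_02": "sadness"},
    "p02": {"vid_01": "disgust", "vid_02": "sadness"},  # perceivers can disagree
}

# Accuracy: the fraction of (participant, video) pairs where the model's
# prediction matches the human categorization.
pairs = [(p, v) for p, resp in human_responses.items() for v in resp]
hits = sum(predict(videos[v]) == human_responses[p][v] for p, v in pairs)
print(f"model-human agreement: {hits / len(pairs):.2f}")  # 0.75 here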


They found that even the best models were far from perfect, highlighting that facial characteristics, such as gender or age, may also be important sources of information beyond simple facial movements.

Researchers also found that existing models perform better for Western than for East Asian participants, highlighting the persistent bias towards Western representations of emotional facial expressions in current theories in the field.

Lead author of the study, Dr. Lukas Snoek from the University of Glasgow’s School of Psychology & Neuroscience, said, “The question of how we read emotions from faces is important, and sometimes controversial. Since Darwin, there have been hypotheses about which facial movements correspond to a specific emotion, for example that a frown in combination with compressed lips signals ‘anger.’

“In our research we were able to show that not all humans will perceive the same emotion from the same set of facial movements. Our results indicate that these individual differences are in part due to one’s cultural background and we show that incorporating culture in emotion perception models improves their performance substantially.”

About this facial expression and emotion research news

Author: Press Office
Source: University of Glasgow
Contact: Press Office – University of Glasgow
Image: The image is in the public domain

Original Research: Open access.
“Testing, explaining, and exploring models of facial expressions of emotions” by Lukas Snoek et al. Science Advances


Abstract

Testing, explaining, and exploring models of facial expressions of emotions

Models are the hallmark of mature scientific inquiry. In psychology, this maturity has been reached in a pervasive question—what models best represent facial expressions of emotion?

Several hypotheses propose different combinations of facial movements [action units (AUs)] as best representing the six basic emotions and four conversational signals across cultures.

We developed a new framework to formalize such hypotheses as predictive models, compare their ability to predict human emotion categorizations in Western and East Asian cultures, explain the causal role of individual AUs, and explore updated, culture-accented models that improve performance by reducing a prevalent Western bias.
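To make the framework's core idea concrete, here is a minimal Python sketch based on our reading rather than the paper's implementation; the AU configurations are illustrative assumptions only. A hypothesis is formalized as a mapping from AU configurations to emotion categories, and a culture-accented variant conditions that mapping on the perceiver's culture.

# A minimal sketch of a predictive model and a "culture-accented" variant.
# The AU configurations below are illustrative assumptions, not the
# paper's fitted models.

def predict(model, aus):
    """Return the first emotion whose required AUs are all present."""
    for emotion, required in model.items():
        if required <= aus:
            return emotion
    return None

# One shared model for all perceivers (a Western-style "fear" configuration).
shared_model = {"fear": {"AU1", "AU2", "AU5", "AU20"}}  # raised brows/lids + mouth stretch

# Culture-accented models: the same emotion may be signalled by partly
# different AU sets for perceivers from different cultures.
accented_models = {
    "western":    {"fear": {"AU1", "AU2", "AU5", "AU20"}},
    "east_asian": {"fear": {"AU1", "AU2", "AU5"}},  # hypothetical eye-region accent
}

def predict_accented(culture, aus):
    return predict(accented_models[culture], aus)

face = {"AU1", "AU2", "AU5"}                 # eyes widened, mouth neutral
print(predict(shared_model, face))           # None: the shared model misses it
print(predict_accented("east_asian", face))  # "fear": the accented model fits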

Our predictive models also provide a noise ceiling to inform the explanatory power and limitations of different factors (e.g., AUs and individual differences).
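One common way to estimate such a noise ceiling, sketched below under our own assumptions rather than the paper's exact estimator, is to predict each participant's response from the modal response of the remaining participants; no model based on the stimulus alone can systematically beat that bound.

# A minimal sketch of a noise-ceiling estimate via leave-one-out (LOO)
# modal agreement. Data and method details are illustrative assumptions.

from collections import Counter

# Hypothetical responses: participant -> {video: perceived emotion}.
responses = {
    "p01": {"vid_01": "anger",   "vid_02": "sadness"},
    "p02": {"vid_01": "anger",   "vid_02": "fear"},
    "p03": {"vid_01": "disgust", "vid_02": "sadness"},
}

def loo_mode(video, held_out):
    """Most common label for a video among all participants but one."""
    labels = [resp[video] for p, resp in responses.items() if p != held_out]
    return Counter(labels).most_common(1)[0][0]

# The ceiling is the accuracy of predicting each held-out participant's
# responses with the other participants' modal response.
total = hits = 0
for p, resp in responses.items():
    for video, label in resp.items():
        hits += loo_mode(video, p) == label
        total += 1
print(f"noise ceiling (LOO modal agreement): {hits / total:.2f}")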

Thus, our framework provides a new approach to test models of social signals, explain their predictive power, and explore their optimization, with direct implications for theory development.
