Summary: Researchers report that perceived sex influences the emotional interpretation of faces and voices, and vice versa. Faces and voices are judged to be male when they are angry and female when they are happy.
Source: University of Essex
Faces and voices are more likely to be judged as male when they are angry, and as female when they are happy, new University of Essex research has revealed.
The study led by Dr Sebastian Korb found that how we understand the emotional expression of a face or voice is heavily influenced by perceived sex, and vice versa.
The paper published in the journal Emotion reveals that both men and women subconsciously make the same mistakes.
Dr Korb, from the Department of Psychology, hopes the research will be expanded and could help make us more aware of our built-in biases.
He said: “This study shows how important it is not to rely too much on your first impressions, as they can easily be wrong.
“Next time you find yourself attributing happiness or sadness to a woman be aware of your bias and possible misinterpretation.
“Interestingly there wasn’t a gender divide in the way the perceived sex of a face affected emotional judgements – but women were slightly more sensitive to subtle changes in emotion overall.”
The research used 121 avatar faces and 121 human voices created by modifying the emotional expression in degrees from happy to angry, and the sex on a sliding scale from male to female.
A total of 256 participants in three studies were shown the mock-ups or played the voices and asked to judge emotions and whether someone was male or female.
A comparison of effect sizes showed that, for both faces and voices, emotion influenced the perception of sex more than the other way around.
It is thought this may be due to an unconscious activation of the amygdala – an important emotion centre in the brain.
This almond-shaped cluster of neurons, located deep in the brain, allows us to rapidly detect and react to threats, such as an angry attacker, but it is not involved in determining a person’s sex.
It is also speculated that a bias to perceive males as angry confers an evolutionary advantage, as it prepares us for a fight-or-flight response.
About this psychology and emotion research news
Author: Ben Hall
Source: University of Essex
Contact: Ben Hall – University of Essex
Image: The image is in the public domain
Original Research: Closed access.
“EmoSex: Emotion prevails over sex in implicit judgments of faces and voices” by Sebastian Korb et al. Emotion
Abstract
Appraisals can be influenced by cultural beliefs and stereotypes. In line with this, past research has shown that judgments about the emotional expression of a face are influenced by the face’s sex, and, vice versa, that judgments about a person’s sex depend somewhat on their facial expression. For example, participants associate anger with male faces, and happiness or sadness with female faces.
However, the strength and the bidirectionality of these effects remain debated. Moreover, the interplay of a stimulus’ emotion and sex remains mostly unknown in the auditory domain.
To investigate these questions, we created a novel stimulus set of 121 avatar faces and 121 human voices (available at https://bit.ly/2JkXrpy) with matched, fine-scale changes along the emotional (happy to angry) and sexual (male to female) dimensions. In a first experiment (N = 76), we found clear evidence for the mutual influence of facial emotion and sex cues on ratings, and moreover for larger implicit (task-irrelevant) effects of a stimulus’s emotion than of its sex.
These findings were replicated and extended in two preregistered studies—one laboratory categorization study using the same face stimuli (N = 108; https://osf.io/ve9an), and one online study with vocalizations (N = 72; https://osf.io/vhc9g).
Overall, the results show that the associations of maleness with anger and femaleness with happiness exist across sensory modalities, and suggest that emotions expressed in the face and voice cannot be entirely disregarded, even when attention is mainly focused on determining a stimulus’s sex.
We discuss the relevance of these findings for cognitive and neural models of face and voice processing.