Moving Away Makes Faces Seem Angrier

Summary: New research shows that our own physical movements can alter how we perceive emotions on others’ faces. In a virtual reality experiment, participants were more likely to judge a face as angry when they actively moved away from it, compared to when the face moved away from them.

The results reveal a two-way relationship between movement and emotion recognition, where avoidance behavior heightens perception of threat. These insights could help improve social interaction design in virtual communication and emotional AI systems.

Key Facts:

  • Behavior Shapes Perception: Actively avoiding a face made participants more likely to perceive anger, suggesting actions influence emotion recognition.
  • Two-Way Link: Findings highlight a reciprocal relationship between bodily movement and emotional perception.
  • Practical Implications: Could improve social realism and empathy in VR and AI-based communication.

Source: TUT

A research team from the Cognitive Neurotechnology Unit and the Visual Perception and Cognition Laboratory at Toyohashi University of Technology has found that approach–avoidance behavior in a virtual reality (VR) environment modulates how individuals recognize facial expressions.

Notably, the study demonstrated that participants were more likely to perceive a facial expression as “angry” when they actively moved away from the face stimulus than when the face moved away from them.

The study provides evidence that one’s own approach–avoidance behavior can modulate facial expression recognition. Credit: Neuroscience News

These findings contribute to a better understanding of the reciprocal relationship between perception and action in social contexts.

The study was published online on July 31, 2025, in the International Journal of Affective Engineering.

Facial expressions play a fundamental role in social communication. While it is well established that others’ expressions influence our behavior—such as approaching a smiling person or avoiding an angry one—the reverse effect, namely whether our own behavior affects how we recognize others’ expressions, has been less explored.

To address this question, the research team conducted three psychophysical experiments using VR. Participants wore a head-mounted display and observed 3D face models (avatars) under four distinct approach–avoidance conditions:

  1. Active approach: The participant approached the avatar.
  2. Active avoidance: The participant moved away from the avatar.
  3. Passive approach: The avatar approached the participant.
  4. Passive avoidance: The avatar moved away from the participant.

The facial expressions were generated by morphing between happy and angry (or fearful) expressions across seven levels. Participants were instructed to judge each expression as either “happy” or “angry” (or “happy” or “fearful”) depending on the experimental condition.
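To make the design concrete, here is a minimal simulated sketch, not the study's code or data, of how binary "happy"/"angry" judgments across seven morph levels might be tallied per condition. The `bias` term is a hypothetical stand-in for the reported shift toward "angry" responses under active avoidance:

```python
import random

random.seed(0)

# Seven morph levels from fully happy (level 1) to fully angry (level 7),
# with a binary "happy"/"angry" judgment per trial. All numbers below are
# simulated for illustration, not taken from the paper.
MORPH_LEVELS = range(1, 8)
TRIALS_PER_LEVEL = 20

def simulate_response(level, bias=0.0):
    """Probability of an 'angry' response rises with morph level;
    `bias` shifts the curve (e.g. active avoidance -> more 'angry')."""
    p_angry = min(1.0, max(0.0, (level - 1) / 6 + bias))
    return random.random() < p_angry

def proportion_angry(bias):
    """Proportion of 'angry' judgments at each morph level for one condition."""
    return {
        lvl: sum(simulate_response(lvl, bias) for _ in range(TRIALS_PER_LEVEL))
        / TRIALS_PER_LEVEL
        for lvl in MORPH_LEVELS
    }

# A positive bias for active avoidance mimics the reported tendency to judge
# the same morphed face as "angry" more often than under passive avoidance.
active_avoid = proportion_angry(bias=0.15)
passive_avoid = proportion_angry(bias=0.0)

for lvl in MORPH_LEVELS:
    print(lvl, active_avoid[lvl], passive_avoid[lvl])
```

Comparing the two response curves in this way (e.g. where each crosses 50% "angry") is one standard psychophysical summary of such a shift, though the paper's actual analysis may differ.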

Results from Experiment 1 showed that participants were more likely to recognize the avatar’s expression as “angry” when they actively avoided the face, compared to when the avatar moved away from them.

This suggests that one’s own avoidance behavior may enhance the perception of threat in others’ facial expressions. The pattern supports the hypothesis that behavior and perception are linked in a bidirectional manner.

Yugo Kobayashi, the first author and a doctoral student at the Department of Computer Science and Engineering, commented: “In today’s communication environments such as video conferencing, opportunities for physical movement are limited. These findings suggest that face-to-face communication involving bodily action may facilitate more natural recognition of facial expressions.”

The study provides evidence that one’s own approach–avoidance behavior can modulate facial expression recognition. Future work will examine which aspects of these behaviors—such as motor intention, visual motion, or proprioceptive feedback—are critical to this modulation.

Funding:

This work was supported by JSPS KAKENHI (Grant Numbers JP21K21315, JP22K17987, JP20H05956, and JP20H04273), the Nitto Foundation, and research funding support for doctoral course students at Toyohashi University of Technology in FY2024.

Key Questions Answered:

Q: What was the main finding of the study?

A: People were more likely to perceive a face as angry when they moved away from it themselves, compared to when the face moved away from them.

Q: Why is this important for understanding emotions?

A: It shows that our physical behavior—approaching or avoiding—directly shapes how we interpret others’ emotions, revealing a feedback loop between motion and perception.

Q: How could this research be applied?

A: The findings could enhance emotional AI, telepresence, and virtual environments by integrating body-based perception cues.

About this social neuroscience research news

Author: Shino Okazaki
Source: TUT
Contact: Shino Okazaki – TUT
Image: The image is credited to Neuroscience News

Original Research: Open access.
“Facial Expression Recognition is Modulated by Approach-Avoidance Behavior” by Yugo Kobayashi et al. International Journal of Affective Engineering


Abstract

Facial Expression Recognition is Modulated by Approach-Avoidance Behavior

Facial expression recognition influences approach-avoidance behaviors, but can these behaviors affect facial expression recognition?

We conducted psychophysical experiments using virtual reality to investigate this reverse causal relationship.

Participants responded to static 3D face stimuli generated by morphing expressions between happy and angry in Experiments 1 and 3.

For Experiment 2, happy-fearful morphed stimuli were employed. Participants either approached or avoided the face, or were approached or avoided by it.

The results showed that participants recognized the face as angrier when they avoided it rather than when it avoided them (Experiment 1); as happy when approaching and fearful when avoiding, irrespective of who acted (Experiment 2); and as angrier when the face approached them rather than when they approached it if both parties were physically close (Experiment 3).

These findings suggest that approach-avoidance behavior influences facial expression recognition. We posit that unconscious learning rooted in biological instincts creates this connection.
