Summary: A new fMRI study reveals that our brains encode both what others intend to express emotionally and how we consciously infer their feelings—two distinct processes. Researchers trained machine-learning models on brain activity to separately predict the speaker’s self-reported emotions and the observer’s inferences.
They found that even when people misjudged someone’s emotions, their brains still carried a latent signature of the speaker’s intended feeling. Alignment between these two brain patterns predicted greater empathic accuracy, offering insights into how social understanding works and why it sometimes fails.
Key Facts
- Dual Neural Signatures: Observers’ brains hold separate patterns for the target’s intent and their own inference.
- Latent Recognition: The brain encodes the speaker’s intended emotion even when conscious judgment is wrong.
- Empathy Correlates: Greater alignment between intent and inference patterns predicts higher empathic accuracy.
Source: Neuroscience News
New fMRI research reveals the brain harbors hidden knowledge of others’ emotions, shedding light on how we read—and misread—social signals.
Humans are remarkably good at “reading the room.” With little effort, we watch someone’s face, hear their voice, and quickly form an impression of how they’re feeling.
But where in the brain does this ability come from? And why do we sometimes get it wrong?
A new study of 100 people scanned with fMRI while they judged the emotions of others suggests the brain encodes not only what we think someone is feeling, but also what they actually intend to convey—even when our conscious inference misses the mark.
The findings, published this week, reveal two distinct neural patterns in the observer’s brain: one representing the target’s actual self-reported emotional intensity (their intent), and another representing the observer’s subjective interpretation (their inference).
Intriguingly, the “intent” pattern was detectable even when the observer’s inference was inaccurate. And when the two patterns aligned more closely in the brain, observers were better at correctly inferring what the target was feeling—essentially, greater empathic accuracy.
The research builds on decades of work showing that social connection and empathy are critical to mental and physical health. Poor social signal processing has been linked to conditions such as autism, schizophrenia, and social anxiety.
Understanding how the brain represents and transforms socioemotional signals could help explain why some people struggle to connect—and how to improve interventions for social disorders.
Seeing What’s Really There
The study’s participants watched videos of people describing emotional life events and rated, moment by moment, how intense they thought the person’s feelings were, all while their brain activity was recorded with fMRI.
The storytellers (or “targets”) had themselves already rated their own emotional intensity while recording the videos—providing a rare “ground truth” of intent to compare against.
Using machine learning, the team trained two separate models on the observers’ brain activity: one to predict the target’s self-reported intent, and one to predict the observer’s own inference.
Both models performed significantly above chance and, importantly, revealed distinct but overlapping brain networks. The intent pattern drew heavily on regions such as the precuneus, angular gyrus, and anterior insula—areas linked to self-referential and social cognition.
The inference pattern relied more on mentalizing and somatosensory areas, suggesting that observers partly simulate the target’s state using their own bodily and emotional memories.
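The logic of training two separate decoders on the same brain data can be sketched in miniature. This is a hypothetical, simplified illustration using synthetic data and a plain closed-form ridge regression; the variable names, dimensions, and regularization choice are assumptions for demonstration, not the study's actual pipeline.

```python
import numpy as np

def ridge_fit(X, y, alpha=10.0):
    # Closed-form ridge regression: w = (X'X + alpha*I)^-1 X'y
    return np.linalg.solve(X.T @ X + alpha * np.eye(X.shape[1]), X.T @ y)

rng = np.random.default_rng(0)
n_timepoints, n_voxels = 200, 120

# Synthetic stand-in for observer brain activity over time
brain = rng.standard_normal((n_timepoints, n_voxels))

# Simulated ratings: intent and inference each depend on a partly
# overlapping subset of "voxels", mirroring distinct-but-overlapping networks
intent = brain[:, :30].mean(axis=1) + 0.1 * rng.standard_normal(n_timepoints)
inference = brain[:, 15:45].mean(axis=1) + 0.1 * rng.standard_normal(n_timepoints)

# Hold out data to mimic validation on novel test data
train, test = slice(0, 150), slice(150, 200)
w_intent = ridge_fit(brain[train], intent[train])
w_infer = ridge_fit(brain[train], inference[train])

# Out-of-sample decoding accuracy for each model
r_intent = np.corrcoef(brain[test] @ w_intent, intent[test])[0, 1]
r_infer = np.corrcoef(brain[test] @ w_infer, inference[test])[0, 1]
print(round(r_intent, 2), round(r_infer, 2))
```

The key design point mirrored here is that both models read from the same brain activity but are fit to different targets, so each learns its own spatial weight pattern.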
Interestingly, even when observers misread the target’s feelings, the intent signal was still present in their brain activity, hinting that our brains unconsciously register more about others’ emotions than we consciously report.
When Minds Align
A key insight emerged when the researchers tested how closely the two patterns—intent and inference—aligned within each participant. When alignment was high, observers were more likely to accurately perceive the target’s emotional intensity.
When the patterns diverged, accuracy suffered. This suggests that the brain’s latent recognition of the target’s intent is a resource observers can tap into—but don’t always fully utilize.
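The alignment analysis can be illustrated with a small synthetic simulation. Everything below is an assumption-laden sketch, not the authors' method: each simulated observer gets a noisy "decoded" version of the intent and inference signals, and across observers the correlation between those two patterns (alignment) is compared with how well the observer's inference tracks the target's intent (empathic accuracy).

```python
import numpy as np

rng = np.random.default_rng(1)
n_observers, n_timepoints = 40, 100

alignments, accuracies = [], []
for _ in range(n_observers):
    intent = rng.standard_normal(n_timepoints)   # target's self-rated intensity
    distortion = rng.uniform(0.2, 2.0)           # observer-specific misreading
    inference = intent + distortion * rng.standard_normal(n_timepoints)

    # Stand-ins for the two decoded brain patterns: one tracks the target's
    # ratings, the other tracks the observer's own ratings (plus decoder noise)
    pattern_intent = intent + 0.3 * rng.standard_normal(n_timepoints)
    pattern_infer = inference + 0.3 * rng.standard_normal(n_timepoints)

    alignments.append(np.corrcoef(pattern_intent, pattern_infer)[0, 1])
    accuracies.append(np.corrcoef(inference, intent)[0, 1])  # empathic accuracy

# Across observers, higher pattern alignment should track higher accuracy
r = np.corrcoef(alignments, accuracies)[0, 1]
print(round(r, 2))
```

In this toy setup the relationship is positive by construction, which is the shape of the result the study reports: when the two neural patterns converge, the observer's conscious judgment lands closer to the target's intent.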
The findings also support the idea that empathy involves both automatic and deliberate processes. The automatic component seems to pick up on the intended meaning of a signal through well-tuned social schemas, while the deliberate inference incorporates autobiographical memories, biases, and expectations. Together, these processes shape our judgments—for better or worse.
Why It Matters
These results open up new avenues for studying empathy and social cognition in more naturalistic, real-world settings. Traditional emotion research often relies on static photos or staged scenarios, but this study used dynamic, authentic stories—capturing the complexity of human communication more realistically.
The distinction between intent and inference may also have clinical implications. In conditions like autism or schizophrenia, where social functioning is impaired, interventions could aim to strengthen the connection between latent intent recognition and conscious inference, helping individuals better translate what they unconsciously perceive into accurate social judgments.
A Window Into Empathy
This research adds to a growing understanding that the brain is constantly processing social information, much of it below conscious awareness. The next challenge is to figure out how to help people access and trust those latent signals more effectively—improving social connection and reducing loneliness.
As the authors note, effective social signaling and accurate inference are foundational to human relationships. By clarifying how these processes unfold in the brain, we move closer to understanding—and perhaps enhancing—the empathy that keeps our communities connected.
About this neuroscience and empathy research news
Author: Neuroscience News Communications
Source: Neuroscience News
Contact: Neuroscience News Communications – Neuroscience News
Image: The image is credited to Neuroscience News
Original Research: Open access.
“Neural signatures of emotional intent and inference align during social consensus” by Marianne C. Reddan et al. Nature Communications
Abstract
Neural signatures of emotional intent and inference align during social consensus
Humans effortlessly transform dynamic social signals into inferences about other people’s internal states.
Here we investigate the neural basis of this process by collecting fMRI data from 100 participants as they rate the emotional intensity of people (targets) describing significant life events.
Targets provide self-ratings on the same scale. We then train and validate two unique multivariate models of observer brain activity.
The first predicts the target’s self-ratings (i.e., intent), and the second predicts observer inferences.
Correspondence between the intent and inference models’ predictions on novel test data increases when observers are more empathically accurate.
However, even when observers make inaccurate inferences, the target’s intent can still be predicted from observer brain activity.
These findings suggest that an observer’s brain contains latent representations of other people’s socioemotional intensity, and that fMRI models of intent and inference can be combined to predict empathic accuracy.