Summary: We’ve long known that the brain uses “GPS-like” circuits to navigate physical space, but a new study suggests it does the same for our feelings. Researchers discovered that the hippocampus and the ventromedial prefrontal cortex (vmPFC) work together to create a mental map of emotions, charting them along two main axes: valence (how pleasant or unpleasant a feeling is) and arousal (the intensity of the bodily reaction).
By combining fMRI data with AI neural networks, the team showed that the brain organizes emotions into a structured hierarchy—ranging from broad “good/bad” categories to highly granular, nuanced feelings. This finding could revolutionize how we treat depression and anxiety, where this internal emotional map often becomes “compressed” and less defined.
Key Facts
- The Emotional GPS: The hippocampus-prefrontal circuit, traditionally linked to memory and spatial navigation, is the primary driver for “mapping” emotion concepts.
- Granular vs. Global: The anterior hippocampus handles broad emotional categories (e.g., “this is good”), while the posterior region manages finer-grained, nuanced concepts.
- Relational Tracking: While the hippocampus stores the “nodes” or categories of emotion, the vmPFC tracks the relationships between them—predicting how we might transition from one feeling to another.
- Clinical Connection: People with high “emotional granularity” (the ability to differentiate subtle feelings) tend to have better mental health outcomes, whereas those with depression often show a “compressed” map with fewer distinctions between emotions.
- AI Validation: Researchers used an artificial neural network called the Tolman-Eichenbaum Machine (TEM) to simulate “virtual robots” walking through an abstract graph of emotions, successfully replicating human brain patterns.
Source: Emory University
It is well established in psychology that humans conceptualize emotions by features known as valence (the degree of pleasantness or unpleasantness) and arousal (the intensity of bodily reactions, such as rapid breathing or a racing heart).
If you think of “pleasantness” as longitude and “bodily reaction” as latitude, you can imagine a “mental map,” with nodes that “chart” knowledge of emotion.
The neural mechanisms giving rise to this configuration, however, have remained unclear.
Now, a new study reveals that hippocampal-prefrontal circuits — neural structures implicated in forming other types of cognitive maps — could support the mental mapping of emotion.
Nature Communications published the research by neuroscientists at Emory University. The results showed how the hippocampus represents emotion concepts in a structured hierarchy of “nodes” of pleasantness and bodily reaction, while the ventromedial prefrontal cortex more accurately tracks relationships between these different nodes, or how they are distributed on the mental map.
Pinpointing the neural mechanisms that produce such map-like representations may ultimately help in the treatment of some mental illnesses, says Philip Kragel, senior author of the research and Emory professor of psychology.
“Research has shown that individuals with depression and anxiety represent emotions in a more compressed, less differentiated way,” he explains. “And that people who represent emotion with more granularity and differentiation tend to have better health outcomes.”
The current paper combined human brain imaging data, pattern recognition and simulations using AI neural networks.
“People’s emotional experiences are subjective,” says Yumeng Ma, first author of the paper and a PhD student in psychology at Emory. “We’re using technology to understand the mechanisms underlying emotions in an objective, scientific way.”
Developing new approaches
“Emotions are central to human experience; they are not simply reactions to things,” Kragel says. “They are important to our success and to our well-being. They help us to communicate better, learn from our experiences, and empathize with others.”
And yet, he adds, emotions have been notoriously difficult to study scientifically.
Kragel is a leader in developing computational methods to study the nature of emotions. His Emotion Cognition and Computation Lab (ECCO Lab) works at the intersection of psychology, cognitive neuroscience and machine learning.
AI neural networks, modeled on the human brain, are one tool used by the lab.
Like the human brain, an artificial neural network must boil down complex data into its essence, a process known as “embedding,” so that vast amounts of knowledge may be stored in an organized and efficient manner.
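As a loose illustration of what embedding means, a few lines of Python can compress high-dimensional data into two dimensions using principal component analysis via a singular value decomposition. The random data and sizes here are placeholders for the sketch; this is not the networks or datasets the lab actually uses.

```python
import numpy as np

# Toy "embedding": project 50-dimensional samples onto the 2 directions
# that capture the most variance (PCA via SVD). The data are random
# placeholders, purely to show the compression step.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 50))   # 100 samples, 50 features
X = X - X.mean(axis=0)           # center each feature

U, S, Vt = np.linalg.svd(X, full_matrices=False)
Z = X @ Vt[:2].T                 # 2-D embedding of every sample

print(Z.shape)                   # each sample is now just 2 numbers
```

The same idea, scaled up and learned rather than computed in closed form, is what lets a network store “vast amounts of knowledge” compactly.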
“For the current paper, we wanted to probe how the human brain compresses emotion experiences,” Kragel says. “How do we embed these very complicated events? What are the relevant neural signals?”
Combining human neuroimaging and pattern recognition
The researchers began by tapping the multimodal dataset Emo-FilM (Emotion Research Using Films and fMRI), a component of OpenNeuro, a free and open platform for validating and sharing neuroscience data.
The Emo-FilM dataset includes ratings of various emotions by participants as they watch short, emotionally evocative film clips. These human ratings of emotion experience and the corresponding brain activity scans can be examined in relation to one another to reduce the gap between theory in psychology and empirical neuroscience. The dataset is tuned to understand underlying emotion processes rather than individual differences.
The researchers developed predictive models to analyze this dataset and found, as expected, that self-report measures of emotional experience could be decoded from fMRI patterns of hippocampal-prefrontal activity.
The hippocampus is a seahorse-shaped structure in the temporal lobe that helps organize experiences into memories by linking information from across the brain. The ventromedial prefrontal cortex, or vmPFC, is a brain region in the frontal lobe involved in weighing information about goals, social cues and outcomes, helping people make decisions and evaluate risk and reward.
Analyzing the outputs of these predictive models revealed that the two brain systems contained information consistent with a map-like representation.
“For example,” Ma explains, “occurrences of anger and fear are often closer together compared to those of happiness and excitement.”
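Ma’s example can be sketched numerically: if each emotion is a point on a valence–arousal plane, plain Euclidean distance reproduces the clustering she describes. The coordinates below are illustrative assumptions for the sketch, not values from the study.

```python
import math

# Illustrative (valence, arousal) coordinates on a -1..1 scale.
# These numbers are invented for the example, not taken from the paper.
emotions = {
    "anger":      (-0.7, 0.8),
    "fear":       (-0.8, 0.7),
    "happiness":  (0.8, 0.5),
    "excitement": (0.7, 0.9),
}

def distance(a, b):
    """Euclidean distance between two emotions on the toy mental map."""
    (v1, a1), (v2, a2) = emotions[a], emotions[b]
    return math.hypot(v1 - v2, a1 - a2)

# Unpleasant, high-arousal emotions sit close together...
print(distance("anger", "fear"))
# ...and far from the pleasant cluster.
print(distance("anger", "happiness"))
```

On this toy map, anger–fear comes out roughly ten times closer than anger–happiness, mirroring the pattern the models found in hippocampal-prefrontal activity.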
The researchers tested the model’s ability to predict both emotion categories and the relations between them. The results showed more information about emotion categories in the hippocampus and more relational information in the vmPFC.
Tapping an artificial neural network
They further probed their framework using an artificial neural network known as the Tolman-Eichenbaum Machine, or TEM, which serves as a computational model of relational memory in the brain.
The researchers first created an artificial environment, represented as an abstract graph, based on emotion category ratings from the film-viewing data. TEM artificial agents, or virtual robots, were exposed to this environment so they could learn how emotion concepts relate to one another.
After this training, trajectories of the artificial agents were plotted as they “walked” through the environment and made their own predictions about what they would experience if they stayed put or moved up, down, to the right or to the left along the graph.
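The agents’ traversal can be pictured as a random walk over an adjacency list, where the neighbors of the current node are exactly what the agent could “experience next.” The tiny graph and transitions below are invented for illustration; they stand in for neither the Emo-FilM ratings nor the TEM architecture itself.

```python
import random

# A toy abstract "emotion graph": nodes are emotion concepts, edges are
# allowed transitions. Entirely hypothetical, for illustration only.
graph = {
    "calm":       ["happiness", "boredom"],
    "happiness":  ["excitement", "calm"],
    "excitement": ["happiness", "fear"],
    "fear":       ["anger", "excitement"],
    "anger":      ["fear", "calm"],
    "boredom":    ["calm"],
}

def walk(start, steps, rng):
    """Random walk over the graph; returns the visited trajectory."""
    node, path = start, [start]
    for _ in range(steps):
        node = rng.choice(graph[node])  # pick one reachable emotion
        path.append(node)
    return path

rng = random.Random(0)
trajectory = walk("calm", 5, rng)
print(trajectory)

# Every step in the trajectory is a valid transition on the graph,
# just as the agents' predictions are constrained by learned structure.
assert all(b in graph[a] for a, b in zip(trajectory, trajectory[1:]))
```

The interesting part of the actual study is that, after training, the agents’ internal representations along such trajectories could be compared against human brain patterns.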
“The main takeaway,” Ma says, “is we found that the hierarchy of emotion categories is represented more broadly — for example, this is good, that is bad — in the anterior part of the hippocampus. And in the posterior region, the representations are more granular, finer-grained concepts.”
The results also showed that the vmPFC appears to track long-term transitions for broad, rather than finer-grained, emotion concepts.
Foundational work
The findings offer a neurocomputational explanation of how humans organize abstract emotion knowledge in a generalized, normative way.
The researchers hope to build on their findings by studying how this mental map may differ among those with mental health issues and across different cultures.
They also want to explore how this mental map for emotions develops over time.
“These are open questions,” Kragel says. “Are you born with the ability to form broad categories of emotion, such as good or bad, and then you gradually learn where to add more nuanced nodes on the graph? Or maybe you’re born with the ability to learn general relational structures. Do the emotions come first? Or is it the other way around?”
Key Questions Answered:
Q: How does the brain map emotions?
A: Think of it like a coordinate system. Your brain uses “pleasantness” as longitude and “intensity” as latitude. On this map, “anger” and “fear” are neighbors because they both feel intense and unpleasant, while “happiness” and “excitement” are on a completely different continent.
Q: What does it mean for an emotional map to be “compressed”?
A: It’s the difference between seeing a map of the world versus a map of just your neighborhood. If your map is compressed (common in depression), you might only feel “bad” without being able to distinguish between being “lonely,” “frustrated,” or “tired.” Identifying the exact emotion (granularity) allows your brain to find better ways to address it.
Q: Can you expand your own emotional map?
A: Yes! Scientists believe that labeling your emotions with more detail—a practice called “affect labeling”—can help expand the nodes on your mental map. By teaching your hippocampus to recognize finer distinctions between feelings, you can improve your emotional regulation and overall well-being.
Editorial Notes:
- This article was edited by a Neuroscience News editor.
- Journal paper reviewed in full.
- Additional context added by our staff.
About this emotion and neuroscience research news
Author: Carol Clark
Source: Emory University
Contact: Carol Clark – Emory University
Image: The image is credited to Neuroscience News
Original Research: Open access.
“Map-like representations of emotion knowledge in hippocampal-prefrontal systems” by Yumeng Ma & Philip A. Kragel. Nature Communications
DOI:10.1038/s41467-025-68240-z
Abstract
Map-like representations of emotion knowledge in hippocampal-prefrontal systems
Emotional experiences involve more than bodily reactions and momentary feelings—they depend on knowledge about the world that spans contexts and time. Although it is well established that individuals conceptualize emotions using a low-dimensional space organized by valence and arousal, the neural mechanisms giving rise to this configuration remain unclear.
Here, we examine whether hippocampal-prefrontal circuits—neural structures implicated in forming cognitive maps—also support the structural abstraction of emotional experiences.
Using functional MRI data collected as participants viewed emotionally evocative film clips, we find that hippocampal activity represents emotion concepts in a structured hierarchy, whereas ventromedial prefrontal cortex more accurately tracks locations in a two-dimensional affective space.
Computational modeling reveals that hippocampal-prefrontal responses to films can be predicted based on the statistical regularities of emotion transitions across multiple temporal scales.
These findings demonstrate that hippocampal-prefrontal systems represent emotion concepts in a map-like way at multiple levels of abstraction, offering insight into how the brain organizes emotion knowledge.

