How the Brain Processes Emotional Situations

Summary: A new study reveals how the brain responds to emotionally charged scenes, showing that the occipital temporal cortex (OTC) differentiates emotional stimuli to guide behavior.

Researchers found that the OTC processes both the type and emotional intensity of stimuli, offering insights into nuanced human reactions. This discovery helps explain how the brain supports complex behavioral choices in response to emotional stimuli. The findings could inform future research on neurological and psychiatric conditions.

Key Facts:

  1. Brain Response: The OTC differentiates emotional stimuli based on type and intensity.
  2. Guiding Behavior: This brain activity helps guide nuanced human reactions.
  3. Research Implications: Findings could aid understanding of neurological and psychiatric conditions.

Source: TCD

The ability to recognize and respond to emotionally charged situations is essential to a species’ evolutionary success.

A new study published today in Nature Communications advances our understanding of how the brain responds to emotionally charged objects and scenes. 

The research, led by Trinity College Dublin neuroscientist Prof. Sonia Bishop and Google researcher Samy Abdel-Ghaffar while he was a PhD student in Prof. Bishop’s lab at UC Berkeley, has identified how the brain represents different categories of emotional stimuli in a way that allows for more than a simple ‘approach-avoid’ dichotomy when guiding behavioural responses.

Participants were asked to categorise the images as positive, negative or neutral and to also rate the emotional intensity of the images. Credit: Neuroscience News

The research was funded by the National Institutes of Health, USA. 

Sonia Bishop, now Chair of Psychology in Trinity’s School of Psychology and senior author of the paper, explains: “It is hugely important for all species to be able to recognise and respond appropriately to emotionally salient stimuli, whether that means not eating rotten food, running from a bear, approaching an attractive person in a bar or comforting a tearful child.

“How the brain enables us to respond in a nuanced way to emotionally charged situations and stimuli has long been of interest. But little is known about how the brain stores schemas or neural representations to support the nuanced behavioural choices we make in response to emotional natural stimuli.

“Neuroscience studies of motivated behaviour often focus on simple approach or avoidance behaviours – such as lever pressing for food or changing locations to avoid a shock. However, when faced with natural emotional stimuli, humans don’t simply choose between ‘approach’ and ‘avoid’. Rather, they select from a complex range of suitable responses.

“So, for example, our ‘avoid’ response to a large bear (leave the area ASAP) is different to our ‘avoid’ response to a weak, diseased animal (don’t get too close). Similarly, our ‘approach’ response to the positive stimulus of a potential mate differs from our ‘approach’ reaction to a cute baby.

“Our research reveals that the occipital temporal cortex is tuned not only to different categories of stimuli but also breaks down these categories based on their emotional characteristics, in a way that is well suited to guiding selection between alternative behaviours.”

The research team from Trinity College Dublin, University of California Berkeley, University of Texas at Austin, Google and University of Nevada, Reno analysed the brain activity of a small group of volunteers as they viewed over 1,500 images depicting natural emotional scenes such as a couple hugging, an injured person in a hospital bed, a luxurious home, and an aggressive dog.

Participants were asked to categorise the images as positive, negative or neutral and to also rate the emotional intensity of the images. A second group of participants picked the behavioural responses that best matched each scene. 

Using cutting-edge modelling of brain activity divided into tiny volumes (each measuring roughly 2.4 x 2.4 x 3 mm), the study discovered that the occipital temporal cortex (OTC), a region at the back of the brain, is tuned to represent both the type of stimulus (single human, couple, crowd, reptile, mammal, food, object, building, landscape etc.) and the emotional characteristics of the stimulus – whether it is negative, positive or neutral, and whether it is high or low in emotional intensity.

Machine learning showed that these stable tuning patterns predicted the behaviours matched to the images by the second group of participants better than machine learning applied directly to image features, suggesting that the OTC efficiently extracts and represents the information needed to guide behaviour.
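To make this comparison concrete, the sketch below illustrates the general idea with a cross-validated classifier trained once on simulated OTC responses and once on simulated image features, scoring how well each predicts a behaviour label per image. All arrays, sizes and the choice of classifier here are illustrative assumptions and not the authors' actual pipeline.

```python
# Hedged sketch: does brain-derived information predict matched behaviours
# better than raw image features? All data below are synthetic stand-ins.
import numpy as np
from sklearn.linear_model import RidgeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_images = 1620                     # size of the stimulus set (from the study)
n_voxels, n_img_feats = 500, 300    # hypothetical dimensionalities
n_behaviours = 8                    # hypothetical number of behaviour options

otc_responses = rng.standard_normal((n_images, n_voxels))     # simulated OTC data
image_features = rng.standard_normal((n_images, n_img_feats)) # simulated features
behaviour_labels = rng.integers(0, n_behaviours, size=n_images)

for name, X in [("OTC tuning patterns", otc_responses),
                ("raw image features", image_features)]:
    # 5-fold cross-validated accuracy of a simple linear classifier
    acc = cross_val_score(RidgeClassifier(alpha=1.0), X, behaviour_labels,
                          cv=5, scoring="accuracy").mean()
    print(f"{name}: cross-validated accuracy = {acc:.3f}")
```

With real data, a reliably higher score for the brain-derived predictors would be the kind of evidence the study reports; with the random numbers above, both scores simply hover around chance.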

Samy Abdel-Ghaffar, of Google, commented: “For this project we used Voxel-Wise Modeling, which combines machine learning methods, large datasets and encoding models, to give us a much more fine-grained understanding of what each part of the OTC represents than traditional neuroimaging methods.

“This approach let us explore the intertwined representation of categorical and emotional scene features, and opened the door to novel understanding of how OTC representations predict behaviour.”    
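Voxel-wise encoding models of this general kind fit a separate regularized regression for every voxel, mapping stimulus features to that voxel's response and testing predictions on held-out images. The minimal sketch below assumes synthetic features and responses and a plain ridge fit; the study's actual feature spaces, preprocessing and validation scheme are more involved.

```python
# Minimal voxel-wise encoding sketch on synthetic data: one ridge regression
# weight vector per voxel, scored by held-out prediction accuracy.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_images, n_features, n_voxels = 1620, 40, 200   # hypothetical sizes

features = rng.standard_normal((n_images, n_features))        # per-image features
true_weights = rng.standard_normal((n_features, n_voxels))    # simulated tuning
responses = features @ true_weights + rng.standard_normal((n_images, n_voxels))

X_train, X_test, y_train, y_test = train_test_split(
    features, responses, test_size=0.2, random_state=0)

model = Ridge(alpha=10.0).fit(X_train, y_train)   # fits all voxels at once
pred = model.predict(X_test)

# Per-voxel accuracy: correlation between predicted and observed responses
voxel_r = [np.corrcoef(pred[:, v], y_test[:, v])[0, 1] for v in range(n_voxels)]
print(f"median held-out voxel correlation: {np.median(voxel_r):.3f}")
```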

Prof. Bishop added: “These findings expand our knowledge of how the human brain represents emotional natural stimuli. In addition, the paradigm used does not involve a complex task, making this approach suitable in the future, for example, for furthering our understanding of how individuals with a range of neurological and psychiatric conditions differ in processing emotional natural stimuli.”

More about the study method:

The team used a novel large dataset of 1,620 emotional natural images and conducted functional magnetic resonance imaging with adult human volunteers, acquiring over 3,800 3D pictures of brain activity while participants viewed these images. Participants judged these images on valence (positive, negative or neutral) and arousal (or emotional intensity).

Modelling this data using small 2.4 x 2.4 x 3 mm chunks, or ‘voxels’, of brain activity, the researchers found that regions of occipital temporal cortex, in the back of the brain, showed differential representation of both stimulus semantic category and affective value. For example, positive high-arousal faces were represented in slightly different regions from negative high-arousal faces and neutral low-arousal faces.

Furthermore, when a completely new group of participants was asked to select behaviours that went with each image, the top dimensions of this neural coding representational ‘space’ better predicted the behaviours selected than the top dimensions based directly on image features (for example, is the stimulus animate? Is it positive?).

This suggests that the brain selects which information is important to represent and holds stable representations of sub-categories of animate and inanimate stimuli that integrate affective information and are organised to support the selection of behavioural responses to different types of emotional natural stimuli.
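One way to picture the “top dimensions” comparison described above: take the principal components of the voxel tuning (encoding-model weight) space, project each image onto a few of them, and test whether those few dimensions predict the chosen behaviours better than the same number of components of the raw image features. The sketch below uses entirely synthetic data and a generic logistic-regression classifier; the dimensionalities, labels and model choices are assumptions, not the published analysis.

```python
# Hedged sketch of the representational-space comparison, on synthetic data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_images, n_features, n_voxels = 1620, 40, 200
image_features = rng.standard_normal((n_images, n_features))   # per-image features
voxel_weights = rng.standard_normal((n_voxels, n_features))    # encoding-model weights
behaviour = rng.integers(0, 8, size=n_images)                  # hypothetical labels

k = 3  # "low dimensionality": keep only the top few dimensions
# Principal axes of the voxel tuning space, expressed in feature coordinates
tuning_axes = PCA(n_components=k).fit(voxel_weights).components_   # (k, n_features)
neural_dims = image_features @ tuning_axes.T        # images projected onto tuning axes
feature_dims = PCA(n_components=k).fit_transform(image_features)  # feature-only axes

for name, X in [("neural-space dimensions", neural_dims),
                ("image-feature dimensions", feature_dims)]:
    acc = cross_val_score(LogisticRegression(max_iter=1000), X, behaviour,
                          cv=5).mean()
    print(f"top {k} {name}: cross-validated accuracy = {acc:.3f}")
```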

About this neuroscience and emotion research news

Author: Fiona Tyrrell
Source: TCD
Contact: Fiona Tyrrell – TCD
Image: The image is credited to Neuroscience News

Original Research: Open access.
“Occipital-temporal cortical tuning to semantic and affective features of natural images predicts associated behavioral responses” by Samy Abdel-Ghaffar et al. Nature Communications


Abstract

Occipital-temporal cortical tuning to semantic and affective features of natural images predicts associated behavioral responses

In everyday life, people need to respond appropriately to many types of emotional stimuli.

Here, we investigate whether human occipital-temporal cortex (OTC) shows co-representation of the semantic category and affective content of visual stimuli. We also explore whether OTC transformation of semantic and affective features extracts information of value for guiding behavior.

Participants viewed 1620 emotional natural images while functional magnetic resonance imaging data were acquired. Using voxel-wise modeling we show widespread tuning to semantic and affective image features across OTC.

The top three principal components underlying OTC voxel-wise responses to image features encoded stimulus animacy, stimulus arousal and interactions of animacy with stimulus valence and arousal.

At low to moderate dimensionality, OTC tuning patterns predicted behavioral responses linked to each image better than regressors directly based on image features.

This is consistent with OTC representing stimulus semantic category and affective content in a manner suited to guiding behavior.
