Summary: Findings shed new light on the role the prefrontal cortex plays in sensory processing and perception.
Source: University of Toronto
An image of a beautiful beach conjures up certain sensations – one can imagine the warmth of the sun as it caresses the skin, and the sound of the water as waves break on the shore. But how is it that the human brain produces these impressions even when an individual isn’t actually standing on a beach, basking in the sun’s rays and listening to the sound of the waves?
Scientists at the University of Toronto (U of T) exploring this mystery found that the brain’s prefrontal cortex – a region known primarily for its role in regulating behaviour, impulse inhibition, and cognitive flexibility – produces such general sensations based on information provided by various senses. The findings provide new insights into the poorly understood role of the prefrontal cortex in human perception.
Using a combination of photographs, sounds and even heated massage stones, the researchers examined patterns of neural activity in the prefrontal cortex, as well as in other brain regions known to process stimulation from the individual senses, and found significant similarities.
“Whether an individual was directly exposed to warmth, for example, or simply looking at a picture of a sunny scene, we saw the same pattern of neural activity in the prefrontal cortex,” said Dirk Bernhardt-Walther, professor in the Department of Psychology in the Faculty of Arts & Science at U of T and co-author of a study published in the Journal of Neuroscience describing the findings. “The results suggest that the prefrontal cortex generalizes perceptual experiences that originate from different senses.”
To understand how the human brain processes the torrent of information from the environment, researchers often study the senses in isolation, with much of prior work focused on the visual system. Bernhardt-Walther says that while such work is illuminating and important, it is equally important to find out how the brain integrates information from the different senses, and how it uses the information in a task-directed manner. “Understanding the basics of these capabilities provides the foundation for research of disorders of perception,” he said.
Using functional magnetic resonance imaging (fMRI) to capture brain activity, the researchers conducted two experiments with the same participants, drawing on the knowledge that regions of the brain respond differently depending on the intensity of stimulation.
In the first, the participants viewed a series of images of various scenes – including beaches, city streets, forests, and train stations – and were asked to judge if the scenes were warm or cold and noisy or quiet. Throughout, neural activity across several regions of the brain was tracked.
In the second experiment, participants were first handed a series of massage stones that were either heated to 45°C or cooled to 9°C, and were later exposed to quiet and noisy sounds – such as birds, people, and waves at a beach.
“When we compared the patterns of activity in the prefrontal cortex, we could determine temperature from both the stone experiment and the picture experiment, because the neural activity patterns for temperature were so consistent between the two,” said lead author Yaelan Jung, who recently completed her PhD at U of T working with Bernhardt-Walther and is now a postdoctoral researcher at Emory University.
“We could successfully determine whether a participant was holding a warm or a cold stone from patterns of brain activity in the somatosensory cortex, which is the part of the brain that receives and processes sensory information from the entire body, while brain activity in the visual cortex told us if they were looking at an image of a warm or cold scene,” said Jung.
The patterns were so consistent that a decoder trained on prefrontal brain activity from the stone experiment was able to predict the temperature of a scene depicted in an image as a participant viewed it.
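The cross-modal decoding described here – training a classifier on neural patterns from the touch experiment and testing it on patterns from the picture experiment – can be illustrated with a toy sketch. The code below is not the authors' analysis pipeline; it uses synthetic "voxel" data and a simple nearest-centroid classifier purely to show the logic of transfer decoding. All names (`warm_pattern`, `simulate_trials`, etc.) are invented for this illustration, and the assumption that both modalities evoke the same underlying pattern is exactly the study's hypothesis, built into the simulation by construction.

```python
import numpy as np

rng = np.random.default_rng(0)
n_voxels = 50  # hypothetical number of voxels in a prefrontal region of interest

# Assumed shared "warm" and "cold" activity patterns (the study's hypothesis)
warm_pattern = rng.normal(0, 1, n_voxels)
cold_pattern = rng.normal(0, 1, n_voxels)

def simulate_trials(pattern, n_trials, noise=0.5):
    # Each trial is the underlying pattern plus measurement noise
    return pattern + rng.normal(0, noise, (n_trials, n_voxels))

# "Training" data: stone (touch) experiment
X_touch = np.vstack([simulate_trials(warm_pattern, 20),
                     simulate_trials(cold_pattern, 20)])
y_touch = np.array([1] * 20 + [0] * 20)  # 1 = warm, 0 = cold

# "Test" data: picture (vision) experiment, simulated here as evoking
# the same prefrontal patterns as direct touch
X_vision = np.vstack([simulate_trials(warm_pattern, 20),
                      simulate_trials(cold_pattern, 20)])
y_vision = np.array([1] * 20 + [0] * 20)

# Nearest-centroid decoder fit only on touch trials
centroid_warm = X_touch[y_touch == 1].mean(axis=0)
centroid_cold = X_touch[y_touch == 0].mean(axis=0)

def decode(x):
    # Classify a trial by whichever class centroid it lies closer to
    d_warm = np.linalg.norm(x - centroid_warm)
    d_cold = np.linalg.norm(x - centroid_cold)
    return 1 if d_warm < d_cold else 0

preds = np.array([decode(x) for x in X_vision])
accuracy = (preds == y_vision).mean()
print(f"Cross-modal decoding accuracy: {accuracy:.2f}")
```

Above-chance accuracy on the vision trials is the signature of cross-modal transfer: the decoder never saw picture-evoked data during training, so it can only succeed if both modalities produce compatible patterns. In real fMRI analyses a regularized linear classifier with cross-validation would typically replace the nearest-centroid rule used here.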
“It tells us about the relationship between someone feeling warmth by looking at a picture versus actually touching a warm object,” Jung said.
Similarly, the researchers could decode noisy versus quiet sounds from the brain’s auditory cortex and pictures of noisy versus quiet scenes from the visual cortex.
“Overall, the neural activity patterns in the prefrontal cortex produced by participants viewing the images were the same as those triggered by actual experience of temperature and noise level,” said Jung.
The researchers suggest the findings may open a new avenue to study how the brain manages to process and represent complex real-world attributes that span multiple senses, even without directly experiencing them.
“In understanding how the human brain integrates information from different senses into higher-level concepts, we may be able to pinpoint the causes of specific inabilities to recognize particular kinds of objects or concepts,” said Bernhardt-Walther.
“Our results might help people with limitations in one sensory modality to compensate with another and reach the same or very similar conceptual representations in their prefrontal cortex, which is essential for making decisions about their environment.”
Funding: Support for the research was provided by the Social Sciences and Humanities Research Council of Canada and the Natural Sciences and Engineering Research Council of Canada.
Neural Representations in the Prefrontal Cortex Are Task Dependent for Scene Attributes But Not for Scene Categories
Natural scenes deliver rich sensory information about the world. Decades of research have shown that the scene-selective network in the visual cortex represents various aspects of scenes. However, less is known about how such complex scene information is processed beyond the visual cortex, such as in the prefrontal cortex. It is also unknown how task context impacts the process of scene perception, modulating which scene content is represented in the brain.
In this study, we investigate these questions using scene images from four natural scene categories that also depict two types of scene attributes: temperature (warm or cold) and sound level (noisy or quiet). Healthy human subjects of both sexes participated in an fMRI study in which they viewed the scene images under two different task conditions: temperature judgment and sound-level judgment.
We analyzed how these scene attributes and categories are represented across the brain under these task conditions. Our findings show that scene attributes (temperature and sound level) are only represented in the brain when they are task relevant. However, scene categories are represented in the brain, in both the parahippocampal place area and the prefrontal cortex, regardless of task context.
These findings suggest that the prefrontal cortex selectively represents scene content according to task demands, but this task selectivity depends on the types of scene content: task modulates neural representations of scene attributes but not of scene categories.
Research has shown that visual scene information is processed in scene-selective regions in the occipital and temporal cortices. Here, we ask how scene content is processed and represented beyond the visual brain, in the prefrontal cortex (PFC).
We show that both scene categories and scene attributes are represented in PFC, with interesting differences in task dependency: scene attributes are only represented in PFC when they are task relevant, but scene categories are represented in PFC regardless of the task context.
Together, our work shows that scene information is processed beyond the visual cortex, and scene representation in PFC reflects how adaptively our minds extract relevant information from a scene.