Summary: Neuroscientists have identified key brain regions that help us link different views of our surroundings.
When asked to visualize your childhood home, you can probably picture not only the house you lived in, but also the buildings next door and across the street. MIT neuroscientists have now identified two brain regions that are involved in creating these panoramic memories.
These brain regions help us to merge fleeting views of our surroundings into a seamless, 360-degree panorama, the researchers say.
“Our understanding of our environment is largely shaped by our memory for what’s currently out of sight,” says Caroline Robertson, a postdoc at MIT’s McGovern Institute for Brain Research and a junior fellow of the Harvard Society of Fellows. “What we were looking for are hubs in the brain where your memories for the panoramic environment are integrated with your current field of view.”
Robertson is the lead author of the study, which appears in the Sept. 8 issue of the journal Current Biology. Nancy Kanwisher, the Walter A. Rosenblith Professor of Brain and Cognitive Sciences and a member of the McGovern Institute, is the paper’s senior author.
As we look at a scene, visual information flows from our retinas into the brain, which has regions responsible for processing different elements of what we see, such as faces or objects. The MIT team suspected that areas involved in processing scenes — the occipital place area (OPA), the retrosplenial complex (RSC), and the parahippocampal place area (PPA) — might also be involved in generating panoramic memories of a place such as a street corner.
If this were true, when you saw two images of houses that you knew were across the street from each other, they would evoke similar patterns of activity in these specialized brain regions. Two houses from different streets would not induce similar patterns.
“Our hypothesis was that as we begin to build memory of the environment around us, there would be certain regions of the brain where the representation of a single image would start to overlap with representations of other views from the same scene,” Robertson says.
The researchers explored this hypothesis using immersive virtual reality headsets, which allowed them to show people many different panoramic scenes. In this study, the researchers showed participants images from 40 street corners in Boston’s Beacon Hill neighborhood.
The images were presented in two ways: Half the time, participants saw a 100-degree stretch of a 360-degree scene; the other half of the time, they saw two noncontinuous stretches of a 360-degree scene.
After showing participants these panoramic environments, the researchers showed them 40 pairs of images and asked whether they came from the same street corner. Participants were much better able to determine whether pairs came from the same corner if they had seen the two scenes linked in the 100-degree image than if they had seen them unlinked.
Brain scans revealed that when participants saw two images that they knew were linked, the response patterns in the RSC and OPA regions were similar. However, this was not the case for image pairs that the participants had not seen as linked. This suggests that the RSC and OPA, but not the PPA, are involved in building panoramic memories of our surroundings, the researchers say.
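The comparison described above — asking whether two scene views evoke similar multivoxel response patterns — can be sketched as a simple pattern-correlation computation. The sketch below is illustrative only, not the authors’ actual analysis pipeline; the voxel counts and response arrays are made-up stand-ins, and the “linked” pattern is simulated as a noisy copy of the first view to mimic overlapping representations.

```python
import numpy as np

def pattern_similarity(pattern_a, pattern_b):
    """Pearson correlation between two voxel response patterns."""
    return np.corrcoef(pattern_a, pattern_b)[0, 1]

rng = np.random.default_rng(0)
n_voxels = 200  # hypothetical number of voxels in a region of interest

# Hypothetical response pattern to one view of a street corner
view_1 = rng.normal(size=n_voxels)

# A "linked" view: overlapping panoramic memory yields a correlated pattern
view_linked = view_1 + rng.normal(scale=1.0, size=n_voxels)

# An "unlinked" view from a different corner: an independent pattern
view_unlinked = rng.normal(size=n_voxels)

sim_linked = pattern_similarity(view_1, view_linked)
sim_unlinked = pattern_similarity(view_1, view_unlinked)

print(f"linked similarity:   {sim_linked:.2f}")
print(f"unlinked similarity: {sim_unlinked:.2f}")
```

In this toy setup, the linked pair correlates far more strongly than the unlinked pair, which is the signature the researchers looked for in the RSC and OPA.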
Priming the brain
In another experiment, the researchers tested whether one image could “prime” the brain to recall other images from the same panoramic scene. They showed participants a scene and asked whether it had appeared on their left or right when they first saw it; immediately beforehand, participants saw either another image from the same street corner or an unrelated image. Participants performed much better when primed with the related image.
“After you have seen a series of views of a panoramic environment, you have explicitly linked them in memory to a known place,” Robertson says. “They also evoke overlapping visual representations in certain regions of the brain, which is implicitly guiding your upcoming perceptual experience.”
About this memory research article
Funding: The research was funded by the National Science Foundation Science and Technology Center for Brains, Minds, and Machines; and the Harvard Milton Fund.
Source: Anne Trafton – MIT
Image Source: This NeuroscienceNews.com image is adapted from the MIT press release.
Original Research: Abstract for “Neural Representations Integrate the Current Field of View with the Remembered 360° Panorama in Scene-Selective Cortex” by Caroline E. Robertson, Katherine L. Hermann, Anna Mynick, Dwight J. Kravitz, and Nancy Kanwisher in Current Biology. Published online September 8, 2016. doi:10.1016/j.cub.2016.07.002
Neural Representations Integrate the Current Field of View with the Remembered 360° Panorama in Scene-Selective Cortex
Highlights
• Visual experience of a 360° panorama forges memory associations between scene views
• Representations of discrete views of a 360° environment overlap in RSC and OPA
• The scene currently in view primes associated views of the 360° environment
Summary
We experience our visual environment as a seamless, immersive panorama. Yet, each view is discrete and fleeting, separated by expansive eye movements and discontinuous views of our spatial surroundings. How are discrete views of a panoramic environment knit together into a broad, unified memory representation? Regions of the brain’s “scene network” are well poised to integrate retinal input and memory: they are visually driven but also densely interconnected with memory structures in the medial temporal lobe. Further, these regions harbor memory signals relevant for navigation and adapt across overlapping shifts in scene viewpoint. However, it is unknown whether regions of the scene network support visual memory for the panoramic environment outside of the current field of view and, further, how memory for the surrounding environment influences ongoing perception. Here, we demonstrate that specific regions of the scene network—the retrosplenial complex (RSC) and occipital place area (OPA)—unite discrete views of a 360° panoramic environment, both current and out of sight, in a common representational space. Further, individual scene views prime associated representations of the panoramic environment in behavior, facilitating subsequent perceptual judgments. We propose that this dynamic interplay between memory and perception plays an important role in weaving the fabric of continuous visual experience.