Summary: A new study reveals that visual clutter alters how information flows between neurons in the brain’s primary visual cortex, but not the order in which it’s processed. Researchers found that the efficiency of information transfer changes depending on the location of clutter in the visual field.
This discovery offers insights into “visual crowding,” the phenomenon that makes it difficult to identify objects in cluttered environments, especially in peripheral vision. The findings deepen our understanding of the neural basis of perception and could inform future research on attention and brain function.
Key Facts:
- Visual clutter affects how efficiently information flows in the visual cortex.
- The study helps explain “visual crowding,” the phenomenon that makes object recognition harder in clutter.
- Different locations of visual clutter impact the flow but not the order of information transfer.
Source: Yale
Whether we’re staring at our phones, the page of a book, or the person across the table, the objects of our focus never stand in isolation; there are always other objects or people in our field of vision. How that visual “clutter” affects visual processing in the brain, however, is not well understood.
In a new study published Oct. 22 in the journal Neuron, Yale researchers show that this clutter alters how information flows in the brain, as does the precise location of that clutter within the wider field of vision. The findings help clarify the neural basis of perception and offer a deeper understanding of the visual cortex in the brain.
“Prior research has shown that visual clutter has an effect on the target of your perception, and to different degrees depending on where that clutter is with respect to where you’re currently looking,” said Anirvan Nandy, an assistant professor of neuroscience at Yale School of Medicine (YSM) and co-senior author of the study.
“So for example, if I’m asked to read the word ‘cat’ out of the corner of my eye, the letter ‘t’ will have a much greater effect than the letter ‘c’ in my inability to accurately identify the letter ‘a,’ even though ‘c’ and ‘t’ are equidistant from ‘a.’”
This phenomenon is called “visual crowding,” and it’s why we can’t read out of the corner of our eyes, no matter how hard we try, and why we have a hard time identifying objects when they are located among the clutter at the edge of our vision, said Nandy.
For the new study, researchers set out to determine what happens in the brain when this visual clutter is present.
To do so, they trained macaque monkeys — a species whose visual system and abilities closely resemble those of humans — to fixate on the center of a screen while visual stimuli were presented inside and outside of the recorded neurons’ receptive fields. During this task, the researchers recorded neural activity in the monkeys’ primary visual cortex, the brain’s main gateway for visual information processing.
The researchers found that the specific location of this clutter within the monkey’s visual field didn’t have much of an effect on how information was passed between neurons in the primary visual cortex. It did, however, affect how efficiently that information flowed.
It’s kind of like a phone tree, in which individuals are asked to call one other person to relay a piece of information until, one after another, every member of the group has received the information.
In the case of visual perception, researchers say, the location of visual clutter didn’t change the order of the phone tree, but it did change how well the message was relayed person to person.
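The phone-tree analogy can be made concrete with a toy simulation (purely illustrative, not the study’s actual model): a message passes through a fixed chain of relays, each hop transmitting it with some fidelity. Lowering the fidelity of one hop degrades the final message without changing the order of the chain — just as clutter changed how well information was relayed between cortical layers, not the route it took.

```python
import random

def relay_message(message, fidelity, rng):
    """Relay a list of bits through one 'person' in the tree:
    each bit flips with probability (1 - fidelity), modeling
    imperfect transfer at that hop."""
    return [b if rng.random() < fidelity else 1 - b for b in message]

def phone_tree(message, fidelities, seed=0):
    """Pass the message through a fixed chain of relays.
    The chain order never changes; only per-hop fidelity does."""
    rng = random.Random(seed)
    for f in fidelities:
        message = relay_message(message, f, rng)
    return message

def accuracy(a, b):
    """Fraction of bits that survived the relay intact."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

original = [1, 0, 1, 1, 0, 0, 1, 0] * 4

# High-fidelity hops everywhere: the message survives mostly intact.
clean = phone_tree(original, fidelities=[0.99] * 5)

# One degraded hop (loosely analogous to clutter at one location
# affecting one stage of transfer more than others):
noisy = phone_tree(original, fidelities=[0.99, 0.7, 0.99, 0.99, 0.99])

print("all good hops :", accuracy(original, clean))
print("one weak hop  :", accuracy(original, noisy))
```

The chain itself (the “order of the phone tree”) is identical in both runs; only the per-hop fidelity parameter differs, which is the distinction the researchers draw.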
“For example, visual clutter in one location would drive information in a particular layer of the primary visual cortex to a lesser extent than clutter in another location,” said Monika Jadi, assistant professor of psychiatry at YSM and co-senior author of the study.
The researchers also uncovered a general property of the visual cortex that was not previously known.
There are several brain areas involved in seeing and recognizing an object, and information is passed through those regions in a particular order. For instance, the primary visual cortex sends a package of information on to the secondary visual cortex, which then sends its own output on to the next stop.
“What was already well understood is that there are complex computations that take place within individual visual areas, and the outputs of these computations are then transferred to the next area along the visual hierarchy to complete the whole object recognition computation,” said Jadi.
What the researchers found in the new study was that there are also subunits within these larger areas that are doing their own computations and relaying some, but not all, of that information to other subunits. The finding bridges a disconnect that had existed between different fields studying vision, said Nandy.
The researchers are now interested in how clutter might affect information processing between brain regions and how attention influences this system.
“When you’re driving, for instance, you may be looking at the car in front of you, but your attention could be focused on a car in the next lane as you try to determine if they’re about to merge,” said Nandy.
Therefore, the detailed visual information you’re getting is from the car in front of you, but the information of interest is outside of your focus.
“How does that attention compensate for the fact that while you don’t have the best resolution information, you’re still able to perceive that attended part of the visual space much better than where you’re actually looking?” said Jadi.
“How does attention influence information flow in the cortex? That’s what we want to explore.”
Xize Xu, a postdoctoral fellow at YSM, and Mitchell Morton, a former postdoctoral associate at YSM, were co-first authors of the study.
About this visual neuroscience research news
Author: Bess Connolly
Source: Yale
Contact: Bess Connolly – Yale
Image: The image is credited to Neuroscience News
Original Research: Closed access.
“Spatial context non-uniformly modulates inter-laminar information flow in the primary visual cortex” by Anirvan Nandy et al. Neuron
Abstract
Spatial context non-uniformly modulates inter-laminar information flow in the primary visual cortex
Our visual experience is a result of the concerted activity of neuronal ensembles in the sensory hierarchy. Yet, how the spatial organization of objects influences this activity remains poorly understood.
We investigate how inter-laminar information flow within the primary visual cortex (V1) is affected by visual stimuli in isolation or with flankers at spatial configurations that are known to cause non-uniform degradation of perception.
By employing dimensionality reduction approaches to simultaneous, layer-specific population recordings, we establish that information propagation between cortical layers occurs along a structurally stable communication subspace.
The spatial configuration of contextual stimuli differentially modulates inter-laminar communication efficacy, the balance of feedforward and effective feedback signaling, and contextual signaling in the superficial layers. Remarkably, these modulations mirror the spatially non-uniform aspects of perceptual degradation.
Our results suggest a model of retinotopically non-uniform cortical connectivity in the output layers of V1 that influences information flow in the sensory hierarchy.
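The “communication subspace” described in the abstract is commonly estimated with reduced-rank regression: fit a linear map predicting target-population activity from source-population activity, then constrain that map to a few dominant dimensions. Below is a minimal NumPy sketch on synthetic data — all dimensions, variable names, and noise levels here are hypothetical, and this is a generic illustration of the technique, not the study’s actual analysis pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 1000 time points, 20 "source layer" neurons, and
# 15 "target layer" neurons whose activity depends on the source
# population only through a 3-dimensional subspace, plus noise.
T, n_src, n_tgt, true_rank = 1000, 20, 15, 3
X = rng.standard_normal((T, n_src))
W = rng.standard_normal((n_src, true_rank)) @ rng.standard_normal((true_rank, n_tgt))
Y = X @ W + 0.1 * rng.standard_normal((T, n_tgt))

# Full-rank ordinary least-squares map from source to target activity.
B_full, *_ = np.linalg.lstsq(X, Y, rcond=None)

def reduced_rank(X, B_ols, r):
    """Project the OLS map onto the top-r principal directions of the
    fitted values -- the standard reduced-rank regression solution.
    The retained directions play the role of 'communication' dimensions."""
    Yhat = X @ B_ols
    _, _, Vt = np.linalg.svd(Yhat, full_matrices=False)
    V_r = Vt[:r].T
    return B_ols @ V_r @ V_r.T

# If prediction barely improves beyond a small rank, the populations
# communicate through a low-dimensional subspace.
for r in (1, true_rank, n_tgt):
    B_r = reduced_rank(X, B_full, r)
    mse = np.mean((Y - X @ B_r) ** 2)
    print(f"rank {r:2d}: mean squared prediction error {mse:.4f}")
```

On this synthetic example, prediction error should plateau once the rank reaches the true subspace dimension, which is the signature analyses like this look for in real layer-to-layer recordings.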