Spying on Thousands of Neurons in the Brain’s Vision Center Simultaneously

Summary: A novel, custom-built microscope allowed researchers to track the activity of single neurons across the entire visual cortex.

Source: HHMI

Using a custom-built microscope to peer into the mouse brain, scientists have tracked the activity of single neurons across the entire visual cortex.

These recordings, made in the tenths of a second after the animals saw a cue on a screen, expose the complex dynamics involved in making sense of what the eyes see.

In an unprecedented combination of breadth and detail, the results describe the behavior of more than 21,000 total neurons in six mice over five days, Howard Hughes Medical Institute Investigator Mark Schnitzer’s team reports in the journal Nature on May 18, 2022.

His team is the first to get a glimpse of individual cells’ activity occurring at the same time throughout eight parts of the brain involved in vision.

“People have studied these brain areas before, but prior imaging studies did not have cellular resolution across the entire visual cortex,” says Schnitzer, a neuroscientist at Stanford University.

The work highlights the dramatic sequence of events that unfolds in the brain from the instant it receives messages from the eyes until it decides how to respond to that sight. The researchers’ far-reaching but fine-grained imaging approach made it possible for them to collect an “incredible” set of data, says Tatiana Engel, a computational neuroscientist at Cold Spring Harbor Laboratory who was not involved in the study.

While previous studies have already explored aspects of this process, such as variations in single neurons’ activity and coordination between larger brain areas, this research offers an expansive new view, she says. “The scale on which they’re able to address these topics is very impressive.”

When the eyes see an image, they send electrical signals that end up in the visual cortex, the wrinkly outer layer of the brain near the back of the head. There, the signals trigger a flurry of activity as neurons work together to register an image, evaluate it, and decide how to respond.

To capture activity across the visual cortex, Schnitzer and his colleagues built a custom microscope with a wide field of view. Their system could also capture detail at a resolution of a few thousandths of a millimeter, small enough to detect single neurons. By using genetically engineered mice with neurons that fluoresce when sending signals, the team could watch these cells’ activity.
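
The article does not describe the team’s analysis pipeline, but in fluorescence imaging of this kind a common first step is to convert each cell’s raw brightness trace into a relative change over a slowly varying baseline (ΔF/F) so that signaling events stand out. The minimal Python/NumPy sketch below illustrates that generic step only; the trace, frame counts, baseline window, and simulated events are all invented and are not taken from the study.

```python
import numpy as np

def delta_f_over_f(trace, baseline_percentile=20, window=300):
    """Convert a raw fluorescence trace to dF/F using a rolling
    low-percentile baseline (a simple, common heuristic)."""
    trace = np.asarray(trace, dtype=float)
    baseline = np.array([
        np.percentile(trace[max(0, i - window):i + 1], baseline_percentile)
        for i in range(len(trace))
    ])
    return (trace - baseline) / baseline

# Toy trace: a slowly drifting baseline plus two simulated fluorescence transients.
rng = np.random.default_rng(0)
t = np.arange(3000)                               # hypothetical imaging frames
raw = 100 + 0.005 * t + rng.normal(0, 1, t.size)  # invented baseline drift + noise
raw[500:520] += 30                                # simulated signaling events
raw[1800:1830] += 45
dff = delta_f_over_f(raw)
print("peak dF/F near the two events:", dff[500:520].max(), dff[1800:1830].max())
```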

During the team’s experiments, mice had to make a choice based on one of two visual cues: one prompted the animals to lick a spout for some sugar water, while the other indicated “don’t lick.” The mice performed many of these trials over five days.

With recordings made from the mice’s brains, the team posed a simple question: What happens in the brain when we see something? Their results lay out this invisible process at a time-resolution of fractions of a second and uncover surprising nuances.

Scientists, for example, already knew that individual neurons behave variably when responding to visual signals conveyed by the eyes. But Schnitzer’s team’s experiments revealed a pattern to this unreliable behavior. That pattern could make it easier for brain areas receiving the neurons’ signals to make sense of them and accurately interpret the visual scene.
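
As a rough intuition for why a population can register a scene reliably even when its individual neurons are not, consider the toy simulation below. Every number in it (neuron count, tuning strength, noise level) is invented, and the simple pooled readout is only a generic illustration, not the paper’s analysis: single simulated neurons barely beat chance at telling two stimuli apart, yet the population readout is nearly perfect.

```python
import numpy as np

rng = np.random.default_rng(1)
n_neurons, n_trials = 200, 500

# Each simulated neuron responds slightly more to stimulus A than to stimulus B,
# but trial-to-trial noise makes any single neuron an unreliable reporter.
tuning = rng.normal(0.2, 0.05, n_neurons)                 # invented per-neuron signal
responses_A = tuning + rng.normal(0, 1.0, (n_trials, n_neurons))
responses_B = -tuning + rng.normal(0, 1.0, (n_trials, n_neurons))

# Reliability of one neuron: how often its sign alone identifies the stimulus.
single = ((responses_A[:, 0] > 0).mean() + (responses_B[:, 0] < 0).mean()) / 2

# Reliability of a simple readout that pools the whole population.
population = (((responses_A @ tuning) > 0).mean()
              + ((responses_B @ tuning) < 0).mean()) / 2

print(f"single neuron correct: {single:.2f}  population readout correct: {population:.2f}")
```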

The researchers also documented how, about 200 milliseconds after the visual cue appeared, the animals switched mental gears: messages from the eyes prompted a massive rearrangement in different brain areas’ activity. By roughly half a second after the cue, this surge subsided and the activity became more stable and recognizable.

Next, about one second after the cue appeared, another signal emerged, activating all eight of the brain areas. That signal encoded the animal’s decision to stay still or go for the sugar water. The researchers learned how to read the signal, so they could predict which response the mouse would make.
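
The article does not say how the team read out this signal, but a standard approach in such studies is to train a cross-validated linear decoder on population activity and test whether it predicts the upcoming choice on held-out trials. The sketch below, using scikit-learn on simulated data with an invented “choice” direction, shows the general idea rather than the authors’ actual method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n_trials, n_neurons = 400, 300

# Simulated population activity around the time of the decision:
# the upcoming choice (lick = 1, no-lick = 0) nudges activity along a shared direction.
choice = rng.integers(0, 2, n_trials)
choice_mode = rng.normal(0, 1, n_neurons)          # hypothetical "decision" direction
activity = (rng.normal(0, 1, (n_trials, n_neurons))
            + 0.4 * np.outer(choice - 0.5, choice_mode))

# Cross-validated linear decoder: predict the choice from population activity.
decoder = LogisticRegression(max_iter=1000)
scores = cross_val_score(decoder, activity, choice, cv=5)
print(f"held-out choice-prediction accuracy: {scores.mean():.2f}")
```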

“It’s fascinating how much the brain is doing in the immediate moments after the eyes see the stimulus,” Schnitzer says.

About this visual neuroscience research news

Author: Press Office
Source: HHMI
Contact: Press Office – HHMI
Image: The image is in the public domain

Original Research: Closed access.
“Emergent reliability in sensory cortical coding and inter-area communication” by Mark Schnitzer et al. Nature


Abstract

Emergent reliability in sensory cortical coding and inter-area communication

Reliable sensory discrimination must arise from high-fidelity neural representations and communication between brain areas. However, how neocortical sensory processing overcomes the substantial variability of neuronal sensory responses remains undetermined.

Here we imaged neuronal activity in eight neocortical areas concurrently and over five days in mice performing a visual discrimination task, yielding longitudinal recordings of more than 21,000 neurons.

Analyses revealed a sequence of events across the neocortex, progressing from a resting state, through early stages of perception, to the formation of a task response. At rest, the neocortex had one pattern of functional connections, identified through sets of areas that shared activity cofluctuations.

Within about 200 ms after the onset of the sensory stimulus, such connections rearranged, with different areas sharing cofluctuations and task-related information.

During this short-lived state (approximately 300 ms duration), both inter-area sensory data transmission and the redundancy of sensory encoding peaked, reflecting a transient increase in correlated fluctuations among task-related neurons.

By around 0.5 s after stimulus onset, the visual representation reached a more stable form, the structure of which was robust to the prominent, day-to-day variations in the responses of individual cells. About 1 s into stimulus presentation, a global fluctuation mode conveyed the upcoming response of the mouse to every area examined and was orthogonal to modes carrying sensory data.

Overall, the neocortex supports sensory performance through brief elevations in sensory coding redundancy near the start of perception, neural population codes that are robust to cellular variability, and widespread inter-area fluctuation modes that transmit sensory data and task responses in non-interfering channels.
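
The abstract’s closing point, that sensory information and the upcoming task response travel in non-interfering channels because the corresponding fluctuation modes are orthogonal, can be illustrated with a small numerical toy. In the Python/NumPy sketch below, the two modes, trial counts, and noise levels are all invented; it simply shows that when two population modes are orthogonal, reading activity out along one recovers its own variable while remaining largely blind to the other.

```python
import numpy as np

rng = np.random.default_rng(3)
n_trials, n_neurons = 500, 100

# Two invented population "modes" (directions in activity space): one carrying
# the sensory stimulus, one carrying the upcoming response, made orthogonal here.
stim_mode = rng.normal(0, 1, n_neurons)
stim_mode /= np.linalg.norm(stim_mode)
choice_mode = rng.normal(0, 1, n_neurons)
choice_mode -= (choice_mode @ stim_mode) * stim_mode   # remove any overlap
choice_mode /= np.linalg.norm(choice_mode)

stimulus = rng.integers(0, 2, n_trials) - 0.5
choice = rng.integers(0, 2, n_trials) - 0.5
activity = (np.outer(stimulus, stim_mode)
            + np.outer(choice, choice_mode)
            + rng.normal(0, 0.5, (n_trials, n_neurons)))

# Because the modes are orthogonal, reading activity out along one mode
# recovers its own variable and is largely blind to the other ("non-interfering").
print("mode overlap (dot product):", float(stim_mode @ choice_mode))
print("stim readout vs stimulus:", np.corrcoef(activity @ stim_mode, stimulus)[0, 1])
print("choice readout vs choice:", np.corrcoef(activity @ choice_mode, choice)[0, 1])
print("stim readout vs choice:  ", np.corrcoef(activity @ stim_mode, choice)[0, 1])
```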
