Summary: A new study provides insight into how the brain processes visual information to help control behavior.
Source: Rockefeller University.
What you see is not always what you get. And that, researchers at The Rockefeller University have discovered, is a good thing.
“Every time you move your eye, the whole world moves on your retina,” says Gaby Maimon, head of the Laboratory of Integrative Brain Function. “But you don’t perceive an earthquake happening several times a second.”
That’s because the brain can tell when visual motion is self-generated, canceling out information that would otherwise make us feel—and act—as if the world were whirling around us. It’s an astonishing bit of neural computation—one that Maimon and his team are attempting to decode in fruit flies. And the results of their most recent investigations, published in Cell on January 5, provide fresh insights into how the brain processes visual information to control behavior.
Each time you shift your gaze (and you do so several times a second), the brain sends a command to the eyes to move. But a copy of that command is issued internally to the brain’s own visual system, as well.
This allows the brain to predict that it is about to receive a flood of visual information resulting from the body’s own movement—and to compensate for it by suppressing or enhancing the activity of particular neurons.
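This predictive cancellation can be illustrated with a toy model (a hypothetical sketch for intuition only, not the actual neural circuit): the visual system subtracts the motion it expects from its own motor command, so only externally caused motion survives.

```python
# Toy efference-copy model (illustrative only; not the fly's real circuitry).
# The brain predicts the retinal slip caused by its own motor command and
# subtracts that prediction, so only unexpected motion is perceived.

def perceived_motion(retinal_motion, motor_command):
    """Return motion attributed to the outside world.

    retinal_motion: total image motion measured on the retina (deg/s)
    motor_command: commanded eye/head rotation (deg/s); an internal copy
                   of this command serves as the prediction.
    """
    predicted_self_motion = motor_command  # the efference copy
    return retinal_motion - predicted_self_motion

# A 30 deg/s gaze shift with no external motion: the retina sees 30 deg/s
# of slip, but the prediction cancels it completely.
print(perceived_motion(30.0, 30.0))  # 0.0

# The same gaze shift during a 5 deg/s external drift: only the drift remains.
print(perceived_motion(35.0, 30.0))  # 5.0
```

The same subtraction handles both cases: self-generated motion nets out to zero, while any residual is attributed to the world.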
The human brain, however, contains approximately 80 billion neurons, complicating the task of determining precisely how it makes such predictions and alters our perception at the cellular level.
Fortunately, the common fruit fly performs the same kinds of rapid eye movements. The mere 100,000 neurons in its poppy-seed-sized brain must therefore handle the same problems of prediction and perception—but at a scale that Maimon and his colleagues, research associate Anmo Kim and postdoctoral fellow Lisa Fenk, can study in intimate detail.
There are differences between humans and flies, of course. For one thing, a fly’s eyes are bolted to its head. To shift its gaze, it must therefore maneuver like a tiny airplane. And like an airplane, it can rotate around multiple axes, including yaw and roll.
Yet its brain still manages to distinguish between expected and unexpected visual motion.
When a gust of wind unexpectedly blows a fly off course, for example, a powerful reflex known as the optomotor response causes the insect’s head to rotate in the opposite direction, snapping its eyes back toward their original target. The fly also stabilizes its flight path by using its wings to execute a counter-turn.
If a fly intentionally turns to shift its gaze, however, something different occurs. The urge to rotate its head and body back toward the original flight direction is somehow suppressed. Otherwise, it would never be able to shift its gaze at all.
But how does a brain with such limited horsepower finesse such a complex problem?
In a previous study, Kim and Maimon demonstrated that two groups of motion-sensitive neurons in the fly’s visual system are suppressed during rapid intentional turns, inhibiting the insect’s behavioral responses.
In the Cell study, Kim, Fenk and Maimon showed that one of these sets of neurons stabilizes the head during flight turns. And they determined how it does so by measuring the electrical activity in individual neurons and filming the motions of the flies’ heads and wings as they turned on purpose—or were tricked into believing that they had turned by accident. (In some of the experiments, the flies were glued to a minuscule platform and shown images on an LED screen that deceived them into thinking that their gaze had shifted unintentionally.)
Each of the neurons in question could respond to visual motion around several axes. Some were more sensitive to yaw, however, and others to roll.
And that’s where things got interesting.
During intentional turns, each neuron received a signal that was carefully calibrated to suppress sensitivity to visual motion along the yaw axis alone.
Neurons that were more sensitive to yaw got a stronger countervailing signal. Neurons that were less sensitive got a weaker one. Sensitivity to roll, meanwhile, was left unimpaired.
As Maimon explains, this makes sense because flies must first roll and then counter-roll to properly execute intentional turns. If they were to counter-yaw, however, they would never be able to head off in a new direction.
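The axis-selective suppression described above can be sketched as a simple gain model (hypothetical gains and numbers for illustration; the real tuning comes from the paper's recordings): each neuron's response is a weighted sum of yaw and roll motion, and during an intentional turn a motor-related input cancels exactly the yaw-driven part.

```python
# Toy model of axis-selective suppression (illustrative assumptions, not the
# measured circuit). Each neuron responds to a weighted mix of yaw and roll
# motion; during an intentional turn, a motor-related input calibrated to the
# neuron's own yaw sensitivity cancels only the yaw-driven component.

def neuron_response(yaw_gain, roll_gain, yaw, roll, turning=False):
    visual_drive = yaw_gain * yaw + roll_gain * roll
    if turning:
        # Strongly yaw-tuned neurons receive a stronger countervailing
        # signal; weakly yaw-tuned neurons receive a weaker one.
        visual_drive -= yaw_gain * yaw
    return visual_drive

# Two neurons with different tunings, during a turn that produces
# 10 deg/s of yaw and 4 deg/s of roll on the retina:
strong_yaw = neuron_response(0.9, 0.2, yaw=10, roll=4, turning=True)
weak_yaw = neuron_response(0.3, 0.8, yaw=10, roll=4, turning=True)
print(strong_yaw, weak_yaw)  # each neuron now reports only its roll component
```

Because the suppressive signal is scaled per neuron, the yaw component vanishes from the whole population at once while roll sensitivity passes through untouched—the "single instrument" being tuned out of the band.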
The neural silencing process described by the researchers therefore left the flies selectively blind to visual information that would otherwise have interfered with their ability to turn—a feat of neural computation that Maimon likens to tuning out the sound of a single instrument in an entire band.
It’s the first illustration of how brains can subtract just one component of a complex sensory signal carried by an entire population of neurons while leaving other signals in the same population untouched. And it provides a blueprint for understanding how the brains of larger creatures might manage the same kinds of problems.
For while the details of how the brain modulates visual perception might differ in animals whose skulls are packed with more neurons, says Maimon, “we would expect to see similar processes in mammalian brains—including our own.”
About this psychology research article
Funding: This work was supported by the New York Stem Cell Foundation (NYSCF-RQ1NI13), the Searle Scholars Foundation, and the National Institute on Drug Abuse of the NIH (DP2DA035148). Gaby Maimon is a New York Stem Cell Foundation-Robertson Investigator. Lisa Fenk was supported by funds from a Leon Levy Fellowship in Mind, Brain, and Behavior at The Rockefeller University.
Source: Rockefeller University Image Source: NeuroscienceNews.com image is adapted from the Rockefeller University press release. Original Research: Abstract for “Quantitative Predictions Orchestrate Visual Signaling in Drosophila” by Anmo J. Kim, Lisa M. Fenk, Cheng Lyu, and Gaby Maimon in Cell. Published online January 5, 2017 doi:10.1016/j.cell.2016.12.005
Quantitative Predictions Orchestrate Visual Signaling in Drosophila
Highlights
• Optic flow-processing neurons participate in controlling head stability responses
• During flight turns, these neurons receive precisely tuned motor-related inputs
• These inputs suppress specific visual responses while preserving other responses
• These modulations of visual signaling mute maladaptive head movements during turns
Summary
Vision influences behavior, but ongoing behavior also modulates vision in animals ranging from insects to primates. The function and biophysical mechanisms of most such modulations remain unresolved. Here, we combine behavioral genetics, electrophysiology, and high-speed videography to advance a function for behavioral modulations of visual processing in Drosophila. We argue that a set of motion-sensitive visual neurons regulate gaze-stabilizing head movements. We describe how, during flight turns, Drosophila perform a set of head movements that require silencing their gaze-stability reflexes along the primary rotation axis of the turn. Consistent with this behavioral requirement, we find pervasive motor-related inputs to the visual neurons, which quantitatively silence their predicted visual responses to rotations around the relevant axis while preserving sensitivity around other axes. This work proposes a function for a behavioral modulation of visual processing and illustrates how the brain can remove one sensory signal from a circuit carrying multiple related signals.