Imagine you are looking for your wallet on a cluttered desk. As you scan the area, you hold in your mind a mental picture of what your wallet looks like.
MIT neuroscientists have now identified a brain region that stores this type of visual representation during a search. The researchers also found that this region sends signals to the parts of the brain that control eye movements, telling individuals where to look next.
This region, known as the ventral pre-arcuate (VPA), is critical for what the researchers call “feature attention,” which allows the brain to seek objects based on their specific properties. Most previous studies of how the brain pays attention have investigated a different type of attention known as spatial attention — that is, what happens when the brain focuses on a certain location.
“The way that people go about their lives most of the time, they don’t know where things are in advance. They’re paying attention to things based on their features,” says Robert Desimone, director of MIT’s McGovern Institute for Brain Research. “In the morning you’re trying to find your car keys so you can go to work. How do you do that? You don’t look at every pixel in your house. You have to use your knowledge of what your car keys look like.”
Desimone, also the Doris and Don Berkey Professor in MIT’s Department of Brain and Cognitive Sciences, is the senior author of a paper describing the findings in the Oct. 29 online edition of Neuron. The paper’s lead author is Narcisse Bichot, a research scientist at the McGovern Institute. Other authors are Matthew Heard, a former research technician, and Ellen DeGennaro, a graduate student in the Harvard-MIT Division of Health Sciences and Technology.
The researchers focused on the VPA in part because of its extensive connections with the brain’s frontal eye fields, which control eye movements. Located in the prefrontal cortex, the VPA has previously been linked with working memory — a cognitive ability that helps us to gather and coordinate information while performing tasks such as solving a math problem or participating in a conversation.
“There have been a lot of studies showing that this region of the cortex is heavily involved in working memory,” Bichot says. “If you have to remember something, cells in these areas are involved in holding the memory of that object for the purpose of identifying it later.”
In the new study, the researchers found that the VPA also holds what they call an “attentional template” — that is, a memory of the item being sought.
The researchers first showed monkeys a target object, such as a human face, a banana, or a butterfly. After a delay, they showed an array of objects that included the target. When the animal fixed its gaze on the target object, it received a reward. “The animals can look around as long as they want until they find what they’re looking for,” Bichot says.
As the animals performed the task, the researchers recorded electrical activity from neurons in the VPA. Each object produced a distinctive pattern of neural activity, and the neurons encoding the target object remained active throughout the search, firing even more strongly once a match was found.
“When the target object finally enters their receptive fields, they give enhanced responses,” Desimone says. “That’s the signal that the thing they’re looking for is actually there.”
About 20 to 30 milliseconds after the VPA cells respond to the target object, they send a signal to the frontal eye fields, which direct the eyes to lock onto the target.
When the researchers blocked VPA activity, they found that although the animals could still move their eyes around in search of the target object, they could not find it. “Presumably it’s because they’ve lost this mechanism for telling them where the likely target is,” Desimone says.
The researchers believe the VPA may be the equivalent in nonhuman primates of a human brain region called the inferior frontal junction (IFJ). Last year Desimone and postdoc Daniel Baldauf found that the IFJ holds onto the idea of a target object — in that study, either faces or houses — and then directs the correct part of the brain to look for the target.
The researchers are now studying how the VPA interacts with a nearby region called the VPS, which appears to be more important for tasks in which attention must be switched quickly from one object to another. They are also performing additional studies of human attention, in hopes of learning more about attention disorders such as attention deficit hyperactivity disorder (ADHD).
“There’s really an opportunity there to understand something important about the role of the prefrontal cortex in both normal behavior and in brain disorders,” Desimone says.
About this neuroscience and memory research
Source: Anne Trafton – MIT
Image Credit: The image is credited to Jose-Luis Olivares/MIT
Original Research: Abstract for “A Source for Feature-Based Attention in the Prefrontal Cortex” by Narcisse P. Bichot, Matthew T. Heard, Ellen M. DeGennaro, and Robert Desimone in Neuron. Published online October 29, 2015. doi:10.1016/j.neuron.2015.10.001
A Source for Feature-Based Attention in the Prefrontal Cortex
Highlights
• Prefrontal cortex plays a key role in finding objects based on visual features
• Neurons in the VPA region of PFC exhibit the earliest times of feature selection
• Deactivation of VPA impairs the ability to find objects based on their features
• VPA appears to be the source of feature selection in FEF, but not spatial selection
Summary
In cluttered scenes, we can use feature-based attention to quickly locate a target object. To understand how feature attention is used to find and select objects for action, we focused on the ventral prearcuate (VPA) region of prefrontal cortex. In a visual search task, VPA cells responded selectively to search cues, maintained their feature selectivity throughout the delay and subsequent saccades, and discriminated the search target in their receptive fields with a time course earlier than in FEF or IT cortex. Inactivation of VPA impaired the animals’ ability to find targets, and simultaneous recordings in FEF revealed that the effects of feature attention were eliminated while leaving the effects of spatial attention in FEF intact. Altogether, the results suggest that VPA neurons compute the locations of objects with the features sought and send this information to FEF to guide eye movements to those relevant stimuli.