Summary: Researchers have identified neurons in the visual cortex that can help predict an upcoming visual stimulus.
Georg Keller and his group at the Friedrich Miescher Institute for Biomedical Research (FMI) have identified neurons in the visual cortex whose activity predicts an upcoming visual stimulus. This activity emerges with experience and is integrated with the actual sensory input. What we perceive is thus a combination of what we expect to see and what we actually see.
It has happened to all of us: we fail to notice a friend’s new glasses, our partner’s new hairdo, or a typo in the title of a talk. We simply do not see what is right in front of our eyes. Georg Keller, Junior group leader at the FMI, and his team have now found that these oversights are not due to indifference, but can be explained biologically.
In an elaborate study published in Nature Neuroscience, the scientists show that – with experience – certain neurons in the visual cortex predict specific visual inputs in clearly defined situations. Keller explains: “The brain generates predictions of what we will see. For example, when we see a person day in, day out, we know what to expect and ‘create’ an internal image of that face; these expectations then influence what we actually see.”
In the experiments, mice learned to navigate through a virtual tunnel, with different patterns on the tunnel wall serving as visual cues. As the mice gained experience, a specific response predictive of the upcoming stimulus emerged in a defined set of visual cortex neurons. According to first author Aris Fiser, a PhD student in the Keller lab: “These cells were active just before a certain visual cue appeared. By looking at these cells, you could predict what the mouse would see next.” The signals from these neurons were then transmitted further within the visual cortex and integrated with the actual sensory signals coming from the eye.
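The logic of this finding — that activity recorded *before* a cue appears carries information about which cue is coming — can be sketched as a simple decoding exercise. The code below is a hypothetical illustration on simulated data, not the authors' actual analysis pipeline; all variable names and parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: pre-stimulus firing rates (Hz) of 5 neurons on 40 trials,
# where the upcoming wall pattern is cue A (label 0) or cue B (label 1).
n_trials, n_neurons = 40, 5
labels = rng.integers(0, 2, n_trials)
# Assume predictive neurons fire slightly more before their "preferred" cue.
tuning = rng.normal(0.0, 1.0, (2, n_neurons))
rates = 10.0 + tuning[labels] + rng.normal(0.0, 0.5, (n_trials, n_neurons))

def decode_upcoming_cue(train_rates, train_labels, test_rate):
    """Nearest-centroid decoder: predict the cue whose mean
    pre-stimulus activity pattern is closest to the test trial."""
    centroids = np.array([train_rates[train_labels == c].mean(axis=0)
                          for c in (0, 1)])
    return int(np.argmin(np.linalg.norm(centroids - test_rate, axis=1)))

# Leave-one-out cross-validation: accuracy well above chance (0.5) would
# indicate that activity *before* the stimulus carries information about it.
mask = np.ones(n_trials, dtype=bool)
correct = 0
for i in range(n_trials):
    mask[i] = False
    correct += decode_upcoming_cue(rates[mask], labels[mask], rates[i]) == labels[i]
    mask[i] = True
print(f"decoding accuracy: {correct / n_trials:.2f}")
```

The same cross-validated decoding idea, applied to real recordings, is what lets one say "by looking at these cells, you could predict what the mouse would see next."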
In addition, if the mouse did not encounter an expected stimulus, a subgroup of neurons in the visual cortex selectively and strongly responded to its absence. Keller comments: “We showed that visual processing is strongly influenced by expectations. These expectations are formed through experience and can both suppress and enhance responses to deviations. In this way, visual perception is likely always a combination of what we expect to see and what we actually see.”
About this neuroscience research article
Source: FMI
Image Source: NeuroscienceNews.com image is in the public domain.
Original Research: Abstract for “Experience-dependent spatial expectations in mouse visual cortex” by Aris Fiser, David Mahringer, Hassana K Oyibo, Anders V Petersen, Marcus Leinweber and Georg B Keller in Nature Neuroscience. Published online September 12, 2016. doi:10.1038/nn.4385
Cite this article: FMI. “What You See Is Not Always What You Get.” NeuroscienceNews, 19 September 2016. https://neurosciencenews.com/visual-activity-stimuli-5067/
Experience-dependent spatial expectations in mouse visual cortex
In generative models of brain function, internal representations are used to generate predictions of sensory input, yet little is known about how internal models influence sensory processing. Here we show that, with experience in a virtual environment, the activity of neurons in layer 2/3 of mouse primary visual cortex (V1) becomes increasingly informative of spatial location. We found that a subset of V1 neurons exhibited responses that were predictive of the upcoming visual stimulus in a spatially dependent manner and that the omission of an expected stimulus drove strong responses in V1. Stimulus-predictive responses also emerged in V1-projecting anterior cingulate cortex axons, suggesting that anterior cingulate cortex serves as a source of predictions of visual input to V1. These findings are consistent with the hypothesis that visual cortex forms an internal representation of the visual scene based on spatial location and compares this representation with feed-forward visual input.
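The abstract's framing — an internal representation generating predictions that are compared against feed-forward input — is the core of predictive coding. A minimal toy version of that comparison step can be written in a few lines; this is an illustrative sketch of the general idea, not the paper's model, and the function name, gain parameter, and values are assumptions.

```python
# Toy predictive-coding step: the cortical response blends a top-down
# prediction with the feed-forward visual input, and an error signal
# fires when the two disagree.

def v1_response(prediction, visual_input, gain=0.5):
    """Return (perceived, error): the percept is the expectation
    corrected by a gain-weighted prediction error."""
    error = visual_input - prediction      # mismatch (prediction-error) signal
    perceived = prediction + gain * error  # expectation-weighted percept
    return perceived, error

# Expected stimulus present: zero error, percept matches the input.
print(v1_response(prediction=1.0, visual_input=1.0))  # (1.0, 0.0)

# Expected stimulus omitted: large error signal, analogous to the strong
# omission responses reported in V1.
print(v1_response(prediction=1.0, visual_input=0.0))  # (0.5, -1.0)
```

In this picture, the stimulus-predictive neurons supply `prediction` (possibly relayed from anterior cingulate cortex, per the abstract), while the omission-responsive subgroup behaves like the `error` term.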