How The Brain Merges The Senses

Summary: Researchers have proposed a computational model that could help explain multisensory integration in humans.

Source: Bielefeld University.

Scientists from the Cluster of Excellence CITEC unveil the mechanisms of multisensory processing in a new publication in Nature Communications.

Utilizing information from all the senses is critical for building a robust and rich representation of our surroundings. Given the wealth of multisensory information constantly bombarding us, however, how does our brain know which signals go together and thus need to be combined? And how does it integrate such related signals? Scientists from the Cluster of Excellence Cognitive Interaction Technology (CITEC) at Bielefeld University and the Max Planck Institute for Biological Cybernetics have proposed a computational model that explains multisensory integration in humans utilizing a surprisingly simple processing unit. This research, funded by the Bernstein Center for Computational Neuroscience, appears in the current issue of Nature Communications.

A sudden explosion, cracking sounds and flashing lights. In the blink of an eye, you realize that the sounds and lights belong together; you look down and see firecrackers on the sidewalk. The human brain is surprisingly efficient at processing multisensory information. However, we still do not know how it solves the seemingly simple task of deciding whether sound and light belong together or not. “Figuring out a correspondence between the senses is by no means a trivial problem,” says Dr. Cesare Parise, who works at CITEC in the research group Cognitive Neurosciences. Parise, who is also active at the Max Planck Institute for Biological Cybernetics, is the lead author of the new study, which he wrote together with Professor Dr. Marc Ernst, who conducted research at Bielefeld University through March 2016. “Despite originating from the same physical events, visual and auditory information are processed in largely independent neural pathways, and yet, with no apparent effort, we can instantly tell which signals belong together. Such a task would be challenging even for the most advanced robots.”

To understand how humans combine visual and auditory information, volunteers participated in a perception experiment in which they observed random sequences of clicks and flashes. After each sequence, they had to report whether sound and light perceptually belonged together, and which signal appeared first. Statistical analyses revealed that human responses were systematically determined by the similarity (i.e., correlation) of the temporal sequences of the clicks and flashes. “This is a very important finding,” says Prof. Marc Ernst, “not just because it shows that the brain uses the temporal correlation of sound and light to detect whether or not they are physically related, but also because it opens an even more intriguing question: how does the brain detect correlation across the senses?”
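To make the stimulus statistic concrete, here is a minimal Python sketch: it generates one random click train and one random flash train as binary time series and computes their Pearson correlation. The bin count, event count, and use of NumPy are illustrative assumptions, not details taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_event_sequence(n_bins: int, n_events: int, rng) -> np.ndarray:
    """Binary vector with n_events randomly placed pulses (clicks or flashes)."""
    seq = np.zeros(n_bins)
    seq[rng.choice(n_bins, size=n_events, replace=False)] = 1.0
    return seq

# One simulated trial: a click train and a flash train over the same time window.
clicks = random_event_sequence(n_bins=100, n_events=10, rng=rng)
flashes = random_event_sequence(n_bins=100, n_events=10, rng=rng)

# Pearson correlation of the two temporal sequences: the stimulus property
# that, per the study, systematically predicted "belongs together" judgments.
r = np.corrcoef(clicks, flashes)[0, 1]
print(f"temporal correlation of click/flash trains: {r:+.2f}")
```

In the study, the higher this kind of temporal correlation, the more likely observers were to report that sound and light belonged together.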

To answer this question, Parise and Ernst used computational modeling and computer simulations, and identified an elementary neural mechanism that could closely replicate human perception. This mechanism – called the Multisensory Correlation Detector – monitors the senses and looks for similarity (correlation) across visual and auditory signals: if the stimuli have a similar temporal structure, the brain concludes that they belong together and integrates them. Remarkably, this mechanism closely resembles the motion detectors found in the insect brain.
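To give a concrete picture of such a unit, here is a minimal Python sketch of a Reichardt-style correlator adapted to two modalities. The first-order exponential low-pass filters and the time constants tau_fast and tau_slow are illustrative assumptions, not the parameters fitted in the paper; what the sketch preserves is the general scheme of two mirror-symmetric subunits, each multiplying a quickly filtered version of one signal with a more sluggish version of the other, whose combined outputs signal correlation and lag.

```python
import numpy as np

def lowpass(signal: np.ndarray, tau: float, dt: float = 1.0) -> np.ndarray:
    """Causal first-order exponential low-pass filter (illustrative choice)."""
    out = np.zeros_like(signal, dtype=float)
    alpha = dt / (tau + dt)
    out[0] = alpha * signal[0]
    for t in range(1, len(signal)):
        out[t] = out[t - 1] + alpha * (signal[t] - out[t - 1])
    return out

def mcd(vis: np.ndarray, aud: np.ndarray,
        tau_fast: float = 2.0, tau_slow: float = 8.0):
    """Sketch of a multisensory correlation detector (assumed parameters).

    Each subunit multiplies a fast-filtered version of one modality with a
    slow-filtered (effectively delayed) version of the other.
    """
    u1 = lowpass(vis, tau_fast) * lowpass(aud, tau_slow)
    u2 = lowpass(vis, tau_slow) * lowpass(aud, tau_fast)
    corr = float(np.mean(u1 * u2))  # high when the trains share temporal structure
    lag = float(np.mean(u2 - u1))   # signed readout of which modality leads
    return corr, lag
```

The multiplication is what makes the unit a coincidence detector: it responds strongly only when energy is present in both channels at overlapping times, the same trick used by the insect motion detectors mentioned above.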

“This is exciting because it shows that the brain systematically exploits general-purpose processing strategies, which can be implemented across very different domains of perception where the correlation between signals is a key feature, such as the perception of visual motion, 3-D perception using binocular disparities, binaural hearing, and now multisensory processing. Furthermore, such correlation mechanisms can be found in very different animal species, from insects to vertebrates, including humans,” says Prof. Marc Ernst, who has just accepted a new position at Ulm University. To further test the generalizability of this model, Parise and Ernst ran additional computer simulations in which they used the Multisensory Correlation Detector model to replicate several previous findings on the temporal and spatial aspects of multisensory perception. Without further changes, the same model proved capable of replicating human perception in all simulated studies, and displayed the same temporal and spatial constraints of multisensory perception found in humans.
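As a toy illustration of this kind of simulation (a hypothetical demo, not the authors’ actual protocol), the mcd() sketch above can be fed time-shifted copies of the same event train: the lag readout changes with the direction of the shift, mimicking a temporal order judgment.

```python
import numpy as np  # assumes lowpass() and mcd() from the sketch above

rng = np.random.default_rng(1)
flashes = np.zeros(100)
flashes[rng.choice(100, size=10, replace=False)] = 1.0

# Clicks are the same event train, shifted in time (negative = clicks lead).
for shift in (-5, 0, 5):
    clicks = np.roll(flashes, shift)
    corr, lag = mcd(flashes, clicks)
    print(f"shift {shift:+d} bins: corr={corr:.4f}, lag={lag:+.4f}")
```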

Image: In his experiments, Dr. Cesare Parise had test subjects observe and evaluate random flashes of light and clicks to uncover how the brain connects sensory stimuli and keeps them apart. Credit: CITEC/Bielefeld University.

“Over the last decade we have discovered that the brain integrates multisensory information in a statistically optimal fashion. However, the nature of the underlying neural mechanisms has so far defied proper scientific explanation”, says Dr. Cesare Parise. “This study marks a milestone in our understanding of human perception, as it provides for the first time a general mechanism capable of explaining a large variety of findings in multisensory perception”.

“This result has strong application potential,” says Dr. Parise, who has just accepted a new position as a research scientist at Oculus VR (Facebook): “A deep understanding of multisensory processing opens new clinical perspectives for neurological syndromes associated with multisensory impairments, such as Autism Spectrum Disorder and Dyslexia. Moreover, our computational model could easily be implemented in robots and artificial perception systems.”

About this neuroscience research article

Funding: This study is part of the research programs of the Bernstein Center for Computational Neuroscience, Tübingen, funded by the German Federal Ministry of Education and Research (BMBF; Förderkennzeichen: 01GQ1002), the 7th Framework Programme European Project ‘Wearhap’ (601165), and the Deutsche Forschungsgemeinschaft Cluster of Excellence Cognitive Interaction Technology (CITEC, EXC 277).

Source: Dr. Cesare V. Parise – Bielefeld University
Image Source: This NeuroscienceNews.com image is credited to CITEC/Bielefeld University.
Original Research: Full open access research for “Correlation detection as a general mechanism for multisensory integration” by Cesare V. Parise and Marc O. Ernst in Nature Communications. Published online June 6, 2016. doi:10.1038/ncomms11543



Abstract

Correlation detection as a general mechanism for multisensory integration

The brain efficiently processes multisensory information by selectively combining related signals across the continuous stream of multisensory inputs. To do so, it needs to detect correlation, lag and synchrony across the senses; optimally integrate related information; and dynamically adapt to spatiotemporal conflicts across the senses. Here we show that all these aspects of multisensory perception can be jointly explained by postulating an elementary processing unit akin to the Hassenstein–Reichardt detector—a model originally developed for visual motion perception. This unit, termed the multisensory correlation detector (MCD), integrates related multisensory signals through a set of temporal filters followed by linear combination. Our model can tightly replicate human perception as measured in a series of empirical studies, both novel and previously published. MCDs provide a unified general theory of multisensory processing, which simultaneously explains a wide spectrum of phenomena with a simple, yet physiologically plausible model.

