Summary: A new study reveals that visual cues, such as looking at someone’s lips as they talk, help our brains amplify sound.
Looking at someone’s lips is good for listening in noisy environments because it helps our brains amplify the sounds we’re hearing in time with what we’re seeing, finds a new UCL-led study.
The researchers say their findings, published in Neuron, could be relevant to people with hearing aids or cochlear implants, as they tend to struggle to hear conversations in noisy places like a pub or restaurant.
The researchers found that visual information is integrated with auditory information at an earlier, more basic level than previously believed, independent of any conscious or attention-driven processes. When information from the eyes and ears is temporally coherent, the auditory cortex – the part of the brain responsible for interpreting what we hear – boosts the relevant sounds that tie in with what we’re looking at.
“While the auditory cortex is focused on processing sounds, roughly a quarter of its neurons respond to light – we helped discover that a decade ago, and we’ve been trying to figure out why that’s the case ever since,” said the study’s lead author, Dr Jennifer Bizley (UCL Ear Institute).
In a 2015 study, she and her team found that people can pick apart two different sounds more easily if the one they’re trying to focus on happens in time with a visual cue. For this latest study, the researchers presented the same auditory and visual stimuli to ferrets while recording their neural activity. When one of the auditory streams changed in amplitude in conjunction with changes in luminance of the visual stimulus, more of the neurons in the auditory cortex reacted to that sound.
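The key manipulation here is temporal coherence: the luminance of the visual stimulus fluctuates in step with the amplitude envelope of one auditory stream but not the other. A minimal sketch of that idea (illustrative only, not the authors' actual stimulus code; all variable names and parameters are invented) is to generate two independent amplitude envelopes, yoke a luminance signal to one of them, and check which envelope the luminance correlates with:

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 100                        # envelope sample rate (Hz), illustrative
t = np.arange(0, 5, 1 / fs)    # 5 s of stimulus

# Two independent, slowly fluctuating amplitude envelopes, one per auditory stream
smooth = np.ones(50) / 50       # simple moving-average smoother
env_a = np.abs(np.convolve(rng.standard_normal(t.size), smooth, mode="same"))
env_b = np.abs(np.convolve(rng.standard_normal(t.size), smooth, mode="same"))

# Luminance of the visual stimulus tracks stream A's envelope (temporal coherence),
# with a little measurement noise added
luminance = env_a + 0.01 * rng.standard_normal(t.size)

# Correlating luminance with each envelope identifies the coherent stream
corr_a = np.corrcoef(luminance, env_a)[0, 1]   # high: coherent pair
corr_b = np.corrcoef(luminance, env_b)[0, 1]   # low: independent pair
print(f"stream A vs luminance: r = {corr_a:.2f}")
print(f"stream B vs luminance: r = {corr_b:.2f}")
```

In the study, it is the auditory cortex itself that appears to perform this kind of matching: neurons respond more strongly to the sound whose envelope is coherent with the luminance changes.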
“Looking at someone when they’re speaking doesn’t just help us hear because of our ability to recognise lip movements – we’ve shown it’s beneficial at a lower level than that, as the timing of the movements aligned with the timing of the sounds tells our auditory neurons which sounds to represent more strongly. If you’re trying to pick someone’s voice out of background noise, that could be really helpful,” said Dr Bizley.
The researchers say their findings could help develop training strategies for people with hearing loss, as they have had early success in helping people tap into their brain’s ability to link up sound and sight. The findings could also help hearing aid and cochlear implant manufacturers develop smarter ways to amplify sound by linking it to the person’s gaze direction.
The paper adds to evidence that people who are having trouble hearing should get their eyes tested as well.
Funding: The study was led by Dr Bizley and PhD student Huriye Atilgan (UCL Ear Institute) alongside researchers from UCL, the University of Rochester and the University of Washington, and was funded by Wellcome, the Royal Society, the Biotechnology and Biological Sciences Research Council, Action on Hearing Loss, the National Institutes of Health and the Hearing Health Foundation.
Source: Chris Lane – UCL
Publisher: Organized by NeuroscienceNews.com.
Image Source: NeuroscienceNews.com image is adapted from the UCL news release.
Original Research: Open access research in Neuron.
Integration of Visual Information in Auditory Cortex Promotes Auditory Scene Analysis through Multisensory Binding
• Visual stimuli can shape how auditory cortical neurons respond to sound mixtures
• Temporal coherence between senses enhances sound features of a bound multisensory object
• Visual stimuli elicit changes in the phase of the local field potential in auditory cortex
• Vision-induced phase effects are lost when visual cortex is reversibly silenced
How and where in the brain audio-visual signals are bound to create multimodal objects remains unknown. One hypothesis is that temporal coherence between dynamic multisensory signals provides a mechanism for binding stimulus features across sensory modalities. Here, we report that when the luminance of a visual stimulus is temporally coherent with the amplitude fluctuations of one sound in a mixture, the representation of that sound is enhanced in auditory cortex. Critically, this enhancement extends to include both binding and non-binding features of the sound. We demonstrate that visual information conveyed from visual cortex via the phase of the local field potential is combined with auditory information within auditory cortex. These data provide evidence that early cross-sensory binding provides a bottom-up mechanism for the formation of cross-sensory objects and that one role for multisensory binding in auditory cortex is to support auditory scene analysis.