Summary: A new study reveals that visual cues, such as looking at someone’s lips as they talk, help our brains amplify sound.

Source: UCL.

Looking at someone’s lips is good for listening in noisy environments because it helps our brains amplify the sounds we’re hearing in time with what we’re seeing, finds a new UCL-led study.

The researchers say their findings, published in Neuron, could be relevant to people with hearing aids or cochlear implants, who tend to struggle to hear conversations in noisy places like a pub or restaurant.

The researchers found that visual information is integrated with auditory information at an earlier, more basic level than previously believed, independent of any conscious or attention-driven processes. When information from the eyes and ears is temporally coherent, the auditory cortex – the part of the brain responsible for interpreting what we hear – boosts the relevant sounds that tie in with what we’re looking at.
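To make “temporal coherence” concrete, the sketch below (Python, with made-up illustrative signals rather than the study’s actual stimuli) correlates two sounds’ amplitude envelopes with a luminance time course: the coherent sound tracks the light, the distractor does not.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 100                     # sample rate of the envelopes (Hz), illustrative
t = np.arange(0, 5, 1 / fs)  # 5 s of stimulus time

# Slowly fluctuating luminance of a visual stimulus (hypothetical values)
luminance = 0.5 + 0.5 * np.sin(2 * np.pi * 1.0 * t)

# A "coherent" sound whose amplitude envelope tracks the luminance,
# and an "incoherent" distractor with independent fluctuations
env_coherent = luminance + 0.05 * rng.standard_normal(t.size)
env_incoherent = 0.5 + 0.5 * np.sin(2 * np.pi * 1.7 * t) + 0.05 * rng.standard_normal(t.size)

# Temporal coherence here is simply the correlation between each sound's
# envelope and the luminance time course
print(np.corrcoef(luminance, env_coherent)[0, 1])    # high, close to 1
print(np.corrcoef(luminance, env_incoherent)[0, 1])  # much weaker
```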

“While the auditory cortex is focused on processing sounds, roughly a quarter of its neurons respond to light – we helped discover that a decade ago, and we’ve been trying to figure out why that’s the case ever since,” said the study’s lead author, Dr Jennifer Bizley (UCL Ear Institute).

In a 2015 study, she and her team found that people can pick apart two different sounds more easily if the one they’re trying to focus on happens in time with a visual cue. For this latest study, the researchers presented the same auditory and visual stimuli to ferrets while recording their neural activity. When one of the auditory streams changed in amplitude in conjunction with changes in luminance of the visual stimulus, more of the neurons in the auditory cortex reacted to that sound.
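A minimal sketch of this kind of stimulus is given below: two competing amplitude-modulated sound streams, with the luminance of the visual stimulus following one stream’s envelope. The carrier frequencies, modulation rates and duration are illustrative assumptions, not the parameters used in the experiments.

```python
import numpy as np

fs = 44_100                      # audio sample rate (Hz)
dur = 2.0                        # stimulus duration (s), illustrative
t = np.arange(0, dur, 1 / fs)
rng = np.random.default_rng(1)

def smooth_envelope(rate_hz, n, fs, rng):
    """Slowly fluctuating amplitude envelope in [0, 1]."""
    phase = rng.uniform(0, 2 * np.pi)
    return 0.5 + 0.5 * np.sin(2 * np.pi * rate_hz * np.arange(n) / fs + phase)

env_a = smooth_envelope(1.0, t.size, fs, rng)   # target stream's envelope
env_b = smooth_envelope(1.7, t.size, fs, rng)   # competing stream's envelope

stream_a = env_a * np.sin(2 * np.pi * 440 * t)  # 440 Hz carrier (assumption)
stream_b = env_b * np.sin(2 * np.pi * 990 * t)  # 990 Hz carrier (assumption)
mixture = stream_a + stream_b                   # what the listener hears

# The visual stimulus: luminance that follows stream A's envelope, so vision
# is temporally coherent with stream A and incoherent with stream B
luminance = env_a
```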


“Looking at someone when they’re speaking doesn’t just help us hear because of our ability to recognise lip movements – we’ve shown it’s beneficial at a lower level than that, as the timing of the movements aligned with the timing of the sounds tells our auditory neurons which sounds to represent more strongly. If you’re trying to pick someone’s voice out of background noise, that could be really helpful,” said Dr Bizley.

The researchers say their findings could help develop training strategies for people with hearing loss, as they have had early success in helping people tap into their brain’s ability to link up sound and sight. The findings could also help hearing aid and cochlear implant manufacturers develop smarter ways to amplify sound by linking it to the person’s gaze direction.

The paper adds to evidence that people who are having trouble hearing should get their eyes tested as well.

About this neuroscience research article

Funding: The study was led by Dr Bizley and PhD student Huriye Atilgan (UCL Ear Institute) alongside researchers from UCL, the University of Rochester and the University of Washington, and was funded by Wellcome, the Royal Society, the Biotechnology and Biological Sciences Research Council, Action on Hearing Loss, the National Institutes of Health and the Hearing Health Foundation.

Source: Chris Lane – UCL
Publisher: Organized by NeuroscienceNews.com.
Image Source: NeuroscienceNews.com image is adapted from the UCL news release.
Original Research: Open access research “Integration of Visual Information in Auditory Cortex Promotes Auditory Scene Analysis through Multisensory Binding” in Neuron.
doi:10.1016/j.neuron.2017.12.034

Cite This NeuroscienceNews.com Article

MLA: UCL. “Visual Cues Amplify Sound.” NeuroscienceNews, 15 February 2018. <https://neurosciencenews.com/visual-cues-sound-amplification-8493/>.
APA: UCL. (2018, February 15). Visual Cues Amplify Sound. NeuroscienceNews. Retrieved February 15, 2018 from https://neurosciencenews.com/visual-cues-sound-amplification-8493/
Chicago: UCL. “Visual Cues Amplify Sound.” https://neurosciencenews.com/visual-cues-sound-amplification-8493/ (accessed February 15, 2018).


Abstract

Integration of Visual Information in Auditory Cortex Promotes Auditory Scene Analysis through Multisensory Binding

Highlights
• Visual stimuli can shape how auditory cortical neurons respond to sound mixtures
• Temporal coherence between senses enhances sound features of a bound multisensory object
• Visual stimuli elicit changes in the phase of the local field potential in auditory cortex
• Vision-induced phase effects are lost when visual cortex is reversibly silenced

Summary
How and where in the brain audio-visual signals are bound to create multimodal objects remains unknown. One hypothesis is that temporal coherence between dynamic multisensory signals provides a mechanism for binding stimulus features across sensory modalities. Here, we report that when the luminance of a visual stimulus is temporally coherent with the amplitude fluctuations of one sound in a mixture, the representation of that sound is enhanced in auditory cortex. Critically, this enhancement extends to include both binding and non-binding features of the sound. We demonstrate that visual information conveyed from visual cortex via the phase of the local field potential is combined with auditory information within auditory cortex. These data provide evidence that early cross-sensory binding provides a bottom-up mechanism for the formation of cross-sensory objects and that one role for multisensory binding in auditory cortex is to support auditory scene analysis.
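For readers unfamiliar with the analysis, one standard way to estimate the phase of a local field potential (LFP), of the kind the abstract refers to, is to band-pass the signal and take the angle of its analytic (Hilbert) signal. The sketch below uses a synthetic signal; the frequency band and sample rate are assumptions for illustration, not necessarily those used in the paper.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 1000                               # LFP sample rate (Hz), illustrative
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(2)
# Synthetic "LFP": a 4 Hz oscillation buried in noise
lfp = np.sin(2 * np.pi * 4 * t) + 0.3 * rng.standard_normal(t.size)

# Band-pass in a low-frequency band where slow visual dynamics could
# plausibly entrain auditory cortex (2-7 Hz here, an assumed band)
b, a = butter(3, [2, 7], btype="bandpass", fs=fs)
lfp_band = filtfilt(b, a, lfp)

# Instantaneous phase (radians) from the analytic signal
phase = np.angle(hilbert(lfp_band))

# Phase alignment across trials can then be quantified with, e.g., the
# resultant vector length at each time point:
#   itpc = np.abs(np.mean(np.exp(1j * phases_across_trials), axis=0))
```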
