Summary: It’s a common reflex: to hear a faint sound better, we squeeze our eyes shut. However, new research suggests this strategy actually backfires in noisy environments.
By monitoring brain activity via EEG, researchers discovered that closing your eyes triggers a state of “neural criticality” that causes the brain to over-filter sound, accidentally silencing the very thing you are trying to hear.
Key Facts
- The Noise Paradox: While closing your eyes may help in a dead-silent room, it impairs your ability to detect signals amid background noise (like a crowded cafe or a busy street).
- Neural Criticality: Closing the eyes shifts the brain into an internal focus that aggressively filters all incoming data. This “over-filtering” suppresses both the background noise and the target sound.
- The Visual Anchor: Seeing a dynamic video that matches the sound significantly boosts hearing sensitivity. Visual engagement acts as an “anchor,” helping the auditory system stay locked onto the external world.
- Selective Attention vs. Multisensory Integration: The boost isn’t just about having eyes open; the brain performs best when visual and audio information are synchronized (e.g., seeing a mouth move while hearing a voice).
- Experimental Design: Volunteers were tested under four conditions: eyes closed, eyes open (blank screen), looking at a still photo, and watching a matching video.
Source: AIP
Most people will close their eyes when trying to concentrate on a faint sound. Many of us have been told that keeping our eyes closed helps us hear better — that it frees up our brains’ processing abilities and increases our auditory sensitivity. However, that strategy may sometimes backfire, particularly in environments with a lot of loud background noise.
In JASA, published on behalf of the Acoustical Society of America by AIP Publishing, researchers from Shanghai Jiao Tong University tested whether closing the eyes really helps people hear better in noisy environments.
To test this, volunteers listened to a collection of sounds through headphones amid background noise. Then, the volunteers adjusted the volume of the sounds until they could barely make them out over the background noise.
The test was conducted under four conditions: first with eyes closed, then with eyes open while looking at a blank screen, then while looking at a still picture corresponding to the sound, and finally while watching a video synchronized with the sound they were trying to hear.
“We found that, contrary to popular belief, closing one’s eyes actually impairs the ability to detect these sounds,” said author Yu Huang. “Conversely, seeing a dynamic video corresponding to the sound significantly improves hearing sensitivity.”
To find an explanation for this result, the researchers attached electroencephalography (EEG) devices to the participants to monitor their brain activity. They determined that closing the eyes puts a participant’s brain in a state of neural criticality, which more aggressively filters noises and quiet sounds, including the target sounds those participants were trying to detect.
“In a noisy soundscape, the brain needs to actively separate the signal from the background,” said Huang.
“We found that the internal focus promoted by eye closure actually works against you in this context, leading to over-filtering, whereas visual engagement helps anchor the auditory system to the external world.”
The authors emphasize that this result is unique to noisy environments. Against a calmer background, the conventional strategy of keeping one's eyes closed likely does help people detect faint sounds. But because so much of our lives is spent surrounded by noise, it might be better to face the world with eyes wide open.
The researchers plan to continue their work exploring the relationship between vision and hearing.
“Specifically, we want to test incongruent pairings — for example, what happens if you hear a drum but see a bird?” said Huang.
“Does the visual boost come from simply having the eyes open and processing more visual information, or does the brain require the visual and audio information to match perfectly? Understanding this distinction will help us separate the general effects of attention from the specific benefits of multisensory integration.”
Key Questions Answered:
Q: Does closing your eyes actually help you hear better?
A: Not always; it's context-dependent. In a quiet room, it works. But in a noisy environment, your brain needs to actively separate the "signal" from the "noise." Closing your eyes makes your brain retreat inward, causing it to "dim the lights" on all external sensory input, which actually makes the target sound harder to catch.
Q: Is simply keeping your eyes open enough, or does what you look at matter?
A: Simply having your eyes open helps prevent the brain from falling into that "over-filtering" state. However, the researchers found that a matching video provides the biggest boost. Your brain uses the visual timing (like seeing a drumstick hit) to "predict" exactly when the sound will happen, making it much easier to pick out.
Q: Why would closing your eyes make sounds harder to detect?
A: Think of it as your brain's "squelch" setting on a radio. When your eyes are closed, the brain turns up the squelch so high that it blocks out everything but the loudest signals. By keeping your eyes open, you keep that filter "open" enough to let the subtle sounds through.
Editorial Notes:
- This article was edited by a Neuroscience News editor.
- Journal paper reviewed in full.
- Additional context added by our staff.
About this auditory and visual neuroscience research news
Author: Hannah Daniel
Source: AIP
Contact: Hannah Daniel – AIP
Image: The image is credited to Neuroscience News
Original Research: Open access.
“Visual engagement modulates cortical criticality and auditory target detection thresholds in noisy soundscapes” by Ke Ni, Yu Huang, Yi Wei, and Xu Zhang. Journal of the Acoustical Society of America.
DOI: 10.1121/10.0042380
Abstract
Visual engagement modulates cortical criticality and auditory target detection thresholds in noisy soundscapes
The widely held assumption that closing eyes enhances auditory sensitivity has been supported by auditory attention experiments. However, the visual effects on auditory thresholds of detecting target sounds masked by noise remain unexplored.
We investigated the participants’ detection thresholds (n = 25) of target sounds (canoe paddle, drum, lark chirping, train, and keyboard) masked by 70 dB(A) pink noise under four visual conditions (eyes closed, eyes open with blank board, static visual stimulation, and dynamic visual stimulation).
Taking blank visual stimulation as the baseline, eye closure elevated detection thresholds by 1.32 dB on average, whereas dynamic and static relevant visual stimulation lowered them by 2.98 and 1.60 dB, respectively, contrary to conventional belief.
Electroencephalogram recordings (n = 27) demonstrated avalanche critical index reduction of 22.3%–45.2% across five auditory stimuli under eye closure compared with blank stimulation, revealing that non-visual states preferentially stabilize neural dynamics near critical states.
We propose a unified auditory-cortical framework based on the brain dynamics theory to explain both the enhanced auditory target detection during visual engagement in noisy environments and optimized auditory segregation via visual disengagement in quiet settings, advancing our understanding of visual effects on auditory perception in complex noisy soundscapes.

