Unconsciously “Hearing” Distance

Because sound travels much more slowly than light, we can often see distant events before we hear them. That is why we can count the seconds between a lightning flash and its thunder to estimate their distance.
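As a back-of-the-envelope sketch of that counting rule (assuming sound travels at roughly 343 m/s in air and treating light as effectively instantaneous):

```python
# Sketch: estimating lightning distance from the flash-to-thunder delay.
# Assumes a speed of sound of ~343 m/s (dry air near 20 degrees C); light's
# travel time over these distances is negligible.

SPEED_OF_SOUND_M_PER_S = 343.0

def lightning_distance_km(delay_seconds: float) -> float:
    """Distance to a lightning strike, in kilometers, given the sound delay."""
    return SPEED_OF_SOUND_M_PER_S * delay_seconds / 1000.0

# A 3-second gap between flash and thunder puts the strike about 1 km away.
print(round(lightning_distance_km(3), 2))  # ~1.03 km
```

This is why the familiar rule of thumb works: dividing the counted seconds by three gives the distance in kilometers, and dividing by five gives it in miles.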

New research from the University of Rochester, however, reveals that our brains can also detect and process sound delays that are too short to notice consciously. The researchers found that we use even that unconscious information to fine-tune what our eyes see when estimating the distance to nearby events.

“Much of the world around us is audiovisual,” said Duje Tadin, associate professor of brain and cognitive sciences at the University of Rochester and senior author of the study. “Although humans are primarily visual creatures, our research shows that estimating relative distance is more precise when visual cues are supported with corresponding auditory signals. Our brains recognize those signals even when they are separated from visual cues by a time that is too brief to consciously notice.”

Tadin and his colleagues have discovered that humans can unconsciously notice and make use of sound delays as short as 40 milliseconds (ms).
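For scale, a delay that short corresponds to a surprisingly small difference in distance. A minimal sketch, again assuming a ~343 m/s speed of sound:

```python
# Sketch: the source distance implied by a given audiovisual delay,
# assuming sound travels at ~343 m/s and light arrives effectively instantly.

SPEED_OF_SOUND_M_PER_S = 343.0

def implied_distance_m(delay_ms: float) -> float:
    """Distance in meters at which sound would lag light by `delay_ms`."""
    return SPEED_OF_SOUND_M_PER_S * (delay_ms / 1000.0)

# A 40 ms delay corresponds to a source only about 14 m away.
print(round(implied_distance_m(40), 1))  # ~13.7 m
```

In other words, the delays the brain is exploiting here arise over everyday, room-to-street-scale distances, not just distant thunderstorms.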

“Our brains are very good at recognizing patterns that can help us,” said Phil Jaekl, who conducted the research while a postdoctoral researcher in Tadin’s lab. “Now we also know that humans can unconsciously recognize the link between sound delays and visual distance, and then combine that information in a useful way.”

For the study, published in PLOS ONE, the researchers used projected three-dimensional (3D) images to test the human brain’s ability to use sound delays to estimate the relative distance of objects.

In the first experiment, participants were asked to adjust the relative depth of two identical shapes until they appeared to be at the same distance when viewed through 3D glasses. Each shape was paired with an audible “click,” which came either just before or just after the shape appeared, offset by an equally brief interval.

Participants consistently perceived the shape that was paired with the delayed click as being more distant. “This surprised us,” said Jaekl. “When the 3D shapes were the same distance, participants were consistently biased by the sound delay to judge the shape paired with the delayed click as being further away—even though it wasn’t.”

In the second experiment, participants were shown three-dimensional shapes that were quickly repositioned either toward or away from the participant. When the shape was paired with a sound delayed by 42 ms, participants were more likely to perceive it as more distant—even in cases when the object was actually shifted toward the participant. Most importantly, when an object that was shifted away was paired with the sound delay—a pairing consistent with the natural world—participants were able to judge relative distance with greater precision.

“It’s striking that this bias is unconscious—participants were unable to consciously detect when sound delays were present, yet the delays strongly influenced their perception of distance,” said Jaekl, who is currently conducting research at the University of Rochester Medical Center.

About this neuroscience research

Source: Monique Patenaude – University of Rochester
Image Source: The image is in the public domain
Original Research: Full open access research for “Audiovisual Delay as a Novel Cue to Visual Distance” by Philip Jaekl, Jakob Seidlitz, Laurence R. Harris, and Duje Tadin in PLOS ONE. Published online October 9, 2015. doi:10.1371/journal.pone.0141125


Abstract

Audiovisual Delay as a Novel Cue to Visual Distance

For audiovisual sensory events, sound arrives with a delay relative to light that increases with event distance. It is unknown, however, whether humans can use these ubiquitous sound delays as an information source for distance computation. Here, we tested the hypothesis that audiovisual delays can both bias and improve human perceptual distance discrimination, such that visual stimuli paired with auditory delays are perceived as more distant and are thereby an ordinal distance cue. In two experiments, participants judged the relative distance of two repetitively displayed three-dimensional dot clusters, both presented with sounds of varying delays. In the first experiment, dot clusters presented with a sound delay were judged to be more distant than dot clusters paired with equivalent sound leads. In the second experiment, we confirmed that the presence of a sound delay was sufficient to cause stimuli to appear as more distant. Additionally, we found that ecologically congruent pairing of more distant events with a sound delay resulted in an increase in the precision of distance judgments. A control experiment determined that the sound delay duration influencing these distance judgments was not detectable, thereby eliminating decision-level influence. In sum, we present evidence that audiovisual delays can be an ordinal cue to visual distance.

