Summary: Researchers discovered that the ear emits subtle sounds in response to eye movements, allowing them to pinpoint where someone is looking.
The study demonstrates that these ear sounds, potentially caused by muscle contractions or hair cell activations, can reveal eye positions.
This discovery challenges existing beliefs about ear function, suggesting that ear sounds might help synchronize sight and sound perception. The team’s innovative approach could lead to new clinical hearing tests and a deeper understanding of sensory integration.
Key Facts:
- The research uncovered that subtle ear sounds correspond to eye movements, providing insight into where a person is looking.
- This phenomenon is likely caused by the brain’s coordination of eye movements with ear muscle contractions or hair cell activations.
- The findings open possibilities for new clinical tests and a better understanding of how the brain integrates visual and auditory information.
Source: Duke University
Scientists can now pinpoint where someone’s eyes are looking just by listening to their ears.
“You can actually estimate the movement of the eyes, the position of the target that the eyes are going to look at, just from recordings made with a microphone in the ear canal,” said Jennifer Groh, Ph.D., senior author of the new report and a professor in the departments of psychology & neuroscience as well as neurobiology at Duke University.
In 2018, Groh’s team discovered that the ears make a subtle, imperceptible noise when the eyes move. In a new report appearing the week of November 20 in the journal Proceedings of the National Academy of Sciences, the Duke team now shows that these sounds can reveal where your eyes are looking.
It also works the other way around. Just by knowing where someone is looking, Groh and her team were able to predict what the waveform of the subtle ear sound would look like.
These sounds, Groh believes, may be caused when eye movements stimulate the brain to contract either middle ear muscles, which typically help dampen loud sounds, or the hair cells that help amplify quiet sounds.
The exact purpose of these ear squeaks is unclear, but Groh’s initial hunch is that they might help sharpen people’s perception.
“We think this is part of a system for allowing the brain to match up where sights and sounds are located, even though our eyes can move when our head and ears do not,” Groh said.
Understanding the relationship between subtle ear sounds and vision might lead to the development of new clinical tests for hearing.
“If each part of the ear contributes individual rules for the eardrum signal, then they could be used as a type of clinical tool to assess which part of the anatomy in the ear is malfunctioning,” said Stephanie Lovich, one of the lead authors of the paper and a graduate student in psychology & neuroscience at Duke.
Just as the eye’s pupils constrict or dilate like a camera’s aperture to adjust how much light gets in, the ears have their own way of regulating hearing. Scientists long thought that these sound-regulating mechanisms served only to amplify soft sounds or dampen loud ones.
But in 2018, Groh and her team discovered that the same mechanisms were also activated by eye movements, suggesting that the brain informs the ears about the eyes’ movements.
In their latest study, the research team followed up on their initial discovery and investigated whether the faint auditory signals contained detailed information about the eye movements.
To decode people’s ear sounds, Groh’s team at Duke, together with Christopher Shera, Ph.D., a professor at the University of Southern California, recruited 16 adults with unimpaired vision and hearing to Groh’s lab in Durham for a fairly simple eye test.
Participants looked at a static green dot on a computer screen, then, without moving their heads, tracked the dot with their eyes as it disappeared and reappeared up, down, left, right, or diagonal from the starting point. This gave Groh’s team a wide range of auditory signals generated as the eyes moved horizontally, vertically, or diagonally.
An eye tracker recorded where participants’ pupils were darting, to compare against the ear sounds, which were captured with a microphone-embedded pair of earbuds.
The research team analyzed the ear sounds and found unique signatures for different directions of movement. This enabled them to crack the ear sound’s code and calculate where people were looking just by scrutinizing a soundwave.
“Since a diagonal eye movement is just a horizontal component and vertical component, my labmate and co-author David Murphy realized you can take those two components and guess what they would be if you put them together,” Lovich said.
“Then you can go in the opposite direction and look at an oscillation to predict that someone was looking 30 degrees to the left.”
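The decomposition Lovich describes can be sketched in code. This is purely an illustration of the idea, not the study’s actual analysis: the component waveforms below are made-up stand-ins, and the sketch simply assumes the ear signal scales linearly with the horizontal and vertical components of the eye movement, so that a diagonal movement’s waveform is the sum of its parts and a recorded waveform can be inverted by a least-squares fit to recover the movement.

```python
import numpy as np

# Hypothetical component templates: stand-ins for the average ear-canal
# waveform evoked by a purely horizontal or purely vertical saccade of a
# reference amplitude. These shapes are invented for illustration only.
rng = np.random.default_rng(0)
t = np.linspace(0, 0.05, 200)  # 50 ms recording window
horizontal_template = np.sin(2 * np.pi * 60 * t) * np.exp(-t / 0.02)
vertical_template = np.cos(2 * np.pi * 60 * t) * np.exp(-t / 0.02)

def predict_waveform(dx_deg, dy_deg):
    """Predict the ear sound for an eye movement with horizontal component
    dx_deg and vertical component dy_deg, assuming each component scales
    its template linearly and the two add together."""
    return dx_deg * horizontal_template + dy_deg * vertical_template

def infer_movement(waveform):
    """Go the other way: least-squares fit of the two templates to a
    recorded waveform recovers the horizontal and vertical components."""
    basis = np.column_stack([horizontal_template, vertical_template])
    (dx, dy), *_ = np.linalg.lstsq(basis, waveform, rcond=None)
    return dx, dy

# A 30-degree leftward saccade (negative horizontal, zero vertical),
# recorded with a little measurement noise:
recorded = predict_waveform(-30.0, 0.0) + 0.01 * rng.standard_normal(t.size)
dx, dy = infer_movement(recorded)
print(round(dx), round(dy))  # close to -30 and 0
```

Under this toy linear model, any diagonal movement is handled for free: its waveform is just the weighted sum of the two templates, and the fit pulls the weights back out.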
Groh is now starting to examine whether these ear sounds play a role in perception.
One set of projects is focused on how eye-movement ear sounds may be different in people with hearing or vision loss.
Groh is also testing whether people who don’t have hearing or vision loss generate ear signals that can predict how well they do on a sound localization task, like spotting where an ambulance is while driving, which relies on mapping auditory information onto a visual scene.
“Some folks have a really reproducible signal day-to-day, and you can measure it quickly,” Groh said. “You might expect those folks to be really good at a visual-auditory task compared to other folks, where it’s more variable.”
Funding: Groh’s research was supported by a grant from the National Institutes of Health (NIDCD DC017532).
About this visual and auditory neuroscience research news
Author: Dan Vahaba
Source: Duke University
Contact: Dan Vahaba – Duke University
Image: The image is credited to Neuroscience News
Original Research: Open access.
“Parametric Information About Eye Movements is Sent to the Ears” byย Jennifer Groh et al. PNAS
Abstract
Parametric Information About Eye Movements is Sent to the Ears
When the eyes move, the alignment between the visual and auditory scenes changes. We are not perceptually aware of these shifts, which indicates that the brain must incorporate accurate information about eye movements into auditory and visual processing.
Here, we show that the small sounds generated within the ear by the brain contain accurate information about contemporaneous eye movements in the spatial domain: The direction and amplitude of the eye movements could be inferred from these small sounds.
The underlying mechanism(s) likely involve(s) the earโs various motor structures and could facilitate the translation of incoming auditory signals into a frame of reference anchored to the direction of the eyes and hence the visual scene.


