Summary: According to a new PNAS study, our eyes and ears team up to process the sights and sounds we experience.
Simply moving the eyes triggers the eardrums to move too, says a new study by Duke University neuroscientists.
The researchers found that keeping the head still but shifting the eyes to one side or the other sparks vibrations in the eardrums, even in the absence of any sounds.
Surprisingly, these eardrum vibrations start slightly before the eyes move, indicating that movements of the ears and the eyes are controlled by the same motor commands deep within the brain.
“It’s like the brain is saying, ‘I’m going to move the eyes, I better tell the eardrums, too,’” said Jennifer Groh, a professor of psychology and neuroscience at Duke.
The findings, which were replicated in both humans and rhesus monkeys, provide new insight into how the brain coordinates what we see and what we hear. They may also lead to a new understanding of hearing disorders, such as difficulty following a conversation in a crowded room.
The paper appeared Jan. 23 in Proceedings of the National Academy of Sciences.
It’s no secret that the eyes and ears work together to make sense of the sights and sounds around us. Most people find it easier to understand a speaker when they can see the speaker’s face and watch their lips move. And in a famous illusion called the McGurk Effect, video of lip movements dubbed with mismatched audio causes people to perceive a sound that was never spoken.
But researchers are still puzzling over where and how the brain combines these two very different types of sensory information.
“Our brains would like to match up what we see and what we hear according to where these stimuli are coming from, but the visual system and the auditory system figure out where stimuli are located in two completely different ways,” said Groh, who holds a joint appointment in the department of neurobiology at Duke. “The eyes are giving you a camera-like snapshot of the visual scene, whereas for sounds, you have to calculate where they are coming from based on differences in timing and loudness across the two ears.”
Because the eyes are usually darting about within the head, the visual and auditory worlds are constantly in flux with respect to one another, Groh added.
In an experiment designed by Kurtis Gruters, a former doctoral student in Groh’s lab and co-first author on the paper, 16 participants were asked to sit in a dark room and follow shifting LED lights with their eyes. Each participant also wore small microphones in their ear canals that were sensitive enough to pick up the slight vibrations created when the eardrum sways back and forth.
Though eardrums vibrate primarily in response to outside sounds, the brain can also control their movements using small bones in the middle ear and hair cells in the cochlea. These mechanisms help modulate the volume of sounds that ultimately reach the inner ear and brain, and produce small sounds known as otoacoustic emissions.
Gruters found that when the eyes moved, both eardrums moved in sync with one another, one side bulging inward at the same time the other side bulged outward. They continued to vibrate back and forth together until shortly after the eyes stopped moving. Eye movements in opposite directions produced opposite patterns of vibrations.
Larger eye movements also triggered bigger vibrations than smaller eye movements, the team found.
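The two relationships reported here, that opposite saccade directions produce opposite vibration patterns and that larger saccades produce larger vibrations, can be captured in a toy model. The oscillation frequency and gain below are illustrative placeholders, not measurements from the study.

```python
import math

def emreo_waveform(saccade_deg, t_ms, freq_hz=30.0, gain=0.01):
    """Toy model of an eye movement-related eardrum oscillation:
    amplitude scales with saccade size, and opposite saccade
    directions flip the phase. freq_hz and gain are illustrative
    numbers, not values from the paper."""
    amplitude = gain * abs(saccade_deg)       # bigger saccade -> bigger vibration
    sign = 1.0 if saccade_deg >= 0 else -1.0  # direction sets the phase
    return sign * amplitude * math.sin(2 * math.pi * freq_hz * t_ms / 1000.0)

# Opposite saccade directions yield mirror-image waveforms:
left = emreo_waveform(-12.0, t_ms=5.0)
right = emreo_waveform(12.0, t_ms=5.0)
print(left == -right)  # True
```

Because the waveform encodes both the size and direction of the eye movement, a signal like this could in principle carry the spatial information Murphy describes in the next paragraph.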
“The fact that these eardrum movements are encoding spatial information about eye movements means that they may be useful for helping our brains merge visual and auditory space,” said David Murphy, a doctoral student in Groh’s lab and co-first author on the paper. “It could also signify a marker of a healthy interaction between the auditory and visual systems.”
The team, which included Christopher Shera at the University of Southern California and David W. Smith of the University of Florida, is still investigating how these eardrum vibrations impact what we hear, and what role they may play in hearing disorders. In future experiments, they will look at whether up and down eye movements also cause unique signatures in eardrum vibrations.
“The eardrum movements literally contain information about what the eyes are doing,” Groh said. “This demonstrates that these two sensory pathways are coupled, and they are coupled at the earliest points.”
Source: Kara Manke – Duke
Publisher: Organized by NeuroscienceNews.com.
Image Source: NeuroscienceNews.com image is credited to Jessi Cruger and David Murphy, Duke University.
Original Research: Abstract in PNAS.
The eardrums move when the eyes move: A multisensory effect on the mechanics of hearing
Interactions between sensory pathways such as the visual and auditory systems are known to occur in the brain, but where they first occur is uncertain. Here, we show a multimodal interaction evident at the eardrum. Ear canal microphone measurements in humans (n = 19 ears in 16 subjects) and monkeys (n = 5 ears in three subjects) performing a saccadic eye movement task to visual targets indicated that the eardrum moves in conjunction with the eye movement. The eardrum motion was oscillatory and began as early as 10 ms before saccade onset in humans or with saccade onset in monkeys. These eardrum movements, which we dub eye movement-related eardrum oscillations (EMREOs), occurred in the absence of a sound stimulus. The amplitude and phase of the EMREOs depended on the direction and horizontal amplitude of the saccade. They lasted throughout the saccade and well into subsequent periods of steady fixation. We discuss the possibility that the mechanisms underlying EMREOs create eye movement-related binaural cues that may aid the brain in evaluating the relationship between visual and auditory stimulus locations as the eyes move.