When Sight Meets Sound: How the Brain Binds Audiovisual Memories

Summary: Our brains don’t just remember what we see and hear; they fuse the two into cohesive memories. A new study reveals that when visual and auditory speech cues are synchronized, they trigger rhythmic brain activity that strengthens memory formation.

If these cues are out of sync, this synchronized brain activity weakens, resulting in less effective memory recall. The research highlights how timing between senses may shape how vividly we remember events involving both sound and sight.

Key Facts:

  • Synchronized Input Boosts Memory: Simultaneous visual (lip movements) and auditory (speech sounds) cues enhanced brain oscillations linked to stronger memory formation.
  • Timing Matters: When speech sounds lagged behind lip movements, memory-related brain activity diminished during both viewing and recall.
  • Neural Phase Alignment: Effective audiovisual memory integration may depend on whether sensory inputs align within the same phase of brain oscillations.

Source: SfN

When a person remembers their friend telling them a funny story, they associate the sound of that friend talking with the appearance of that friend speaking and laughing.

How does the human brain form audiovisual memories like this?

In a new Journal of Neuroscience paper, Emmanuel Biau, from the University of Liverpool, and colleagues addressed this question by exploring brain activity linked to forming memories that integrate sounds and visual information. 

The researchers elicited memories in study participants by presenting them with movie clips of people speaking. They manipulated when sounds and visual information were presented in the movie clips to explore the impact on brain activity during memory recall.

Movie clips in which speech sounds and lip movements occurred at the same time were linked to oscillatory activity in two brain regions, the neocortex and the hippocampus, during viewing. This oscillatory activity recurred when participants remembered the movie clips later.

But in movie clips where speech sounds lagged behind lip movements, this oscillatory activity was reduced during viewing as well as during memory recall.

Says Biau, “We assume that if auditory and visual speech inputs arrive in the brain at the same time, then their chance of being associated in a memory is much higher because they will fall into the same phase of neural activity, which is not the case for asynchronous stimuli.”
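
To make the phase-alignment idea concrete, here is a minimal Python sketch. It is illustrative only, not from the study: the 5 Hz theta frequency (the theta band spans roughly 4-8 Hz) and the onset times are assumptions. It simply asks whether two sensory onsets land at similar phases of an ongoing theta rhythm:

    import math

    THETA_HZ = 5.0  # assumed theta frequency; theta is roughly 4-8 Hz

    def theta_phase(t_seconds):
        """Phase (radians, in [0, 2*pi)) of an ongoing theta rhythm at time t."""
        return (2 * math.pi * THETA_HZ * t_seconds) % (2 * math.pi)

    visual_onset = 0.050  # hypothetical lip-movement onset (seconds)
    audio_onset = 0.070   # hypothetical speech-sound onset (seconds)

    # Onsets falling at similar theta phases would, on this account, have a
    # higher chance of being bound into a single audiovisual memory.
    dphi = abs(theta_phase(audio_onset) - theta_phase(visual_onset))
    dphi = min(dphi, 2 * math.pi - dphi)  # wrap to [0, pi]
    print(f"phase difference: {dphi:.2f} rad "
          f"({'roughly in phase' if dphi < math.pi / 2 else 'out of phase'})")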

According to the authors, their work suggests that the oscillatory activity in these two brain regions may play a crucial role in integrating auditory and visual information during memory recall, though more work is needed to confirm this. 

About this audiovisual memory research news

Author: SfN Media
Source: SfN
Contact: SfN Media – SfN
Image: The image is credited to Neuroscience News

Original Research: Closed access.
“Neocortical and Hippocampal Theta Oscillations Track Audiovisual Integration and Replay of Speech Memories” by Emmanuel Biau et al. Journal of Neuroscience


Abstract

Neocortical and Hippocampal Theta Oscillations Track Audiovisual Integration and Replay of Speech Memories

Are you talkin’ to me?!

If you have ever watched the masterpiece “Taxi Driver,” directed by Martin Scorsese, you certainly recall the monologue during which Travis Bickle rehearses an imaginary confrontation in front of a mirror.

While remembering this scene, you recollect a myriad of speech features across the visual and auditory senses, with a seamless sensation of a unified memory.

The aim of this study was to investigate how the fine-grained synchrony between coinciding visual and auditory features impacts brain oscillations when forming multisensory speech memories.

We developed a memory task in which participants viewed short synchronous or asynchronous movie clips focused on the faces of speakers in real interviews, while undergoing magnetoencephalography (MEG) recording.

In the synchronous condition, the natural alignment between visual and auditory onsets was kept intact.

In the asynchronous condition, auditory onsets were delayed so that lip movements and speech sounds fell in antiphase with respect to the theta oscillation that synchronised them in the original movie.
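
As a back-of-the-envelope illustration of this manipulation (the 5 Hz theta frequency and the syllable onsets below are assumptions, not values from the paper), delaying the audio by half a theta period places it in antiphase with the video:

    # Sketch of an antiphase delay, assuming a 5 Hz theta rhythm.
    theta_hz = 5.0
    period_s = 1.0 / theta_hz         # one theta cycle: 200 ms at 5 Hz
    antiphase_delay_s = period_s / 2  # half a cycle (180 degrees): 100 ms

    audio_onsets = [0.12, 0.38, 0.61]  # hypothetical syllable onsets (seconds)
    delayed = [round(t + antiphase_delay_s, 2) for t in audio_onsets]
    print(delayed)  # [0.22, 0.48, 0.71]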

Our results first showed that theta oscillations in the neocortex and hippocampus were modulated by the level of synchrony between lip movements and syllables during audiovisual speech perception.

Second, theta asynchrony between the lip movements and auditory envelope during audiovisual speech perception reduced the accuracy of subsequent theta oscillation reinstatement during memory recollection.

We conclude that neural theta oscillations play a pivotal role in both audiovisual integration and memory replay of speech.
