Brainwaves Synchronize to the Speed of Talking, Influencing the Way We Hear Words

Summary: Our brain waves synchronize to the rate at which a person speaks, a new study reports, and this synchronization influences how we hear and understand upcoming words.

Source: Max Planck Institute.

Have you ever found yourself finishing someone else’s sentences, even though you don’t really know them that well? Fortunately, the ability to predict what someone is going to say next isn’t the preserve of turtle doves or those in long-term relationships. Our brain processes all kinds of information to estimate what’s going to come next, and the speed at which the speaker is talking, or speech rate, plays an important role.

A new study, published in the journal Current Biology, delved deeper to find out what happens on a neural level. “The findings show neural dynamics predict the timing of future speech items based on past speech rate, and this influences how ongoing words are heard,” says Anne Kösem of the MPI and the Lyon Neuroscience Research Center, first author of the research paper.

Speech rhythms and perception

“We asked native Dutch participants to listen to Dutch sentences that suddenly changed in speech rate: the beginning of the sentence was either compressed or expanded in duration, leading to a fast or a slow speech rate, while the final three words were consistently presented at the original recorded speech rate,” Kösem explains.

The final word of the sentence contained an ambiguous vowel, which could be interpreted, for example, as either a short “a” or a long “aa” vowel. Crucially, the speed of the beginning of the sentence could influence the way this ambiguous vowel is heard, leading to the perception of words with radically different meanings. For example, in Dutch, the ambiguous word is more likely to be perceived as a long “aa” word when the preceding speech is fast (e.g. taak, “task” in Dutch), and as a short “a” word when the preceding speech is slow (tak, “branch” in Dutch).
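The rate manipulation described above, uniformly compressing or expanding the first part of a recording, can be sketched in a few lines. This is an illustrative assumption, not the authors' stimulus pipeline: the naive resampling used here for brevity also shifts pitch, whereas real experimental stimuli would use a pitch-preserving algorithm such as PSOLA.

```python
import numpy as np

def change_speech_rate(samples: np.ndarray, rate: float) -> np.ndarray:
    """Uniformly compress (rate > 1) or expand (rate < 1) a signal's
    duration by linear resampling. Note: naive resampling also shifts
    pitch; pitch-preserving methods (e.g. PSOLA) avoid this."""
    n_out = int(round(len(samples) / rate))
    old_idx = np.linspace(0, len(samples) - 1, num=n_out)
    return np.interp(old_idx, np.arange(len(samples)), samples)

# Example: a 1 s, 16 kHz tone compressed to half its duration (2x rate).
sr = 16000
tone = np.sin(2 * np.pi * 440 * np.arange(sr) / sr)
fast = change_speech_rate(tone, rate=2.0)
print(len(fast))  # 8000 samples: half the original duration
```

In the experiment only the beginning of each sentence would be manipulated this way, with the final words left at the original rate.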

Participants reported how they perceived the last word of the sentence. The team recorded participants’ brain activity with magnetoencephalography (MEG) while they listened to the sentences, and investigated whether neural activity synchronized to the initial speech rate and whether that influenced how participants comprehended the last word.

Just like riding a bike

The study showed that our brain keeps following past speech rhythms after a change in speech rate. If it synchronizes to the preceding slow speech rate, we are more likely to hear the last ambiguous word with a short vowel; if it synchronizes to the preceding fast speech rate, we are more likely to hear a long vowel. “Our findings suggest that the neural tracking of speech dynamics is a predictive mechanism, which directly influences perception,” adds Kösem.


“Imagine the brain acting like a bicycle wheel. The wheel turns at the speed imposed by pedalling, but it keeps rolling for some time after pedalling has stopped, at a rate that depends on the past pedalling speed.” This sustained synchronization between brainwaves and speech rate helps us predict the length of upcoming syllables, causally influencing the way we process and hear words.
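The freewheeling-wheel analogy can be made concrete with a toy simulation (all parameters below are illustrative, not taken from the study): a lightly damped oscillator driven at a syllable-like rhythm keeps oscillating for several cycles after the drive stops, just as the wheel keeps rolling after pedalling ends.

```python
import numpy as np

# Toy damped oscillator entrained by a rhythmic "speech" drive that
# stops halfway through; parameters are illustrative only.
dt = 0.001                 # integration step (s)
t = np.arange(0.0, 4.0, dt)
f_drive = 4.0              # drive rhythm, ~4 syllables per second
omega0 = 2 * np.pi * 5.0   # oscillator's natural frequency (5 Hz)
gamma = 1.5                # light damping -> slow decay after offset

drive = np.where(t < 2.0, np.sin(2 * np.pi * f_drive * t), 0.0)

x, v = 0.0, 0.0
trace = np.empty_like(t)
for i, d in enumerate(drive):
    a = d - gamma * v - omega0**2 * x  # x'' + gamma*x' + w0^2*x = drive
    v += a * dt                        # semi-implicit Euler (stable here)
    x += v * dt
    trace[i] = x

# The oscillation persists for a while after the drive stops at t = 2 s,
# then gradually dies away.
after = np.abs(trace[(t >= 2.0) & (t < 2.6)]).max()
late = np.abs(trace[t >= 3.6]).max()
print(after > late)  # True: response rings on, then decays
```

A faster drive would leave the oscillator ringing at a faster effective rhythm for those few residual cycles, which is the intuition behind the taak/tak effect described above.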

Fundamental research with future potential

The team believes that, in the future, these findings may help researchers improve speech perception in adverse listening conditions and for the hearing impaired. Kösem added: “One follow-up study currently being performed tests whether word perception can be modulated by directly modifying brain oscillatory activity with transcranial alternating current stimulation.”

About this neuroscience research article

This study was a collaboration between scientists from the Max Planck Institute for Psycholinguistics (MPI), the Donders Institute for Brain, Cognition, and Behaviour in Nijmegen, the Lyon Neuroscience Research Center and the University of Birmingham.

Source: Max Planck Institute
Original Research: Open access research: “Neural Entrainment Determines the Words We Hear” by Anne Kösem, Hans Rutger Bosker, Atsuko Takashima, Antje Meyer, Ole Jensen, and Peter Hagoort in Current Biology. Published September 8, 2018.



Neural Entrainment Determines the Words We Hear

Low-frequency neural entrainment to rhythmic input has been hypothesized as a canonical mechanism that shapes sensory perception in time. Neural entrainment is deemed particularly relevant for speech analysis, as it would contribute to the extraction of discrete linguistic elements from continuous acoustic signals. Yet, its causal influence in speech perception has been difficult to establish. Here, we provide evidence that oscillations build temporal predictions about the duration of speech tokens that directly influence perception. Using magnetoencephalography (MEG), we studied neural dynamics during listening to sentences that changed in speech rate. We observed neural entrainment to preceding speech rhythms persisting for several cycles after the change in rate. The sustained entrainment was associated with changes in the perceived duration of the last word’s vowel, resulting in the perception of words with radically different meanings. These findings support oscillatory models of speech processing, suggesting that neural oscillations actively shape speech perception.
