Summary: Bilingual people are better able to integrate sight and sound to make sense of speech, a new study reveals. Researchers report that, in addition to altering basic sensory experiences, learning a second language can affect memory, decision making, and cognitive control.
Source: Northwestern University.
Learning a second language can change the way our senses work together to interpret speech, according to a new Northwestern University study.
In the study, published today in the journal Brain Sciences, researchers found that bilingual people are better at integrating sight and hearing to make sense of speech.
“We find that language experience can change sensory perception,” said Viorica Marian, a professor of communication sciences and disorders and psychology at Northwestern University. “Our discovery is that bilinguals are more likely to integrate across auditory and visual senses.”
Specifically, when people hear a speech sound (e.g. “ba”) that conflicts with what they see (e.g. “ga”), they will often perceive a completely different sound (e.g. “da”). This illusion is called the “McGurk Effect,” and researchers found it is more likely to occur if you speak more than one language. This demonstrates that language experience can change the way we perceive the world around us.
A video demonstration of the “McGurk Effect” is available on the Bilingualism and Psycholinguistics Research Group website.
“A bilingual and monolingual listening to the same speaker can hear two completely different sounds, showing that language experience affects even the most basic cognitive processes,” said Sayuri Hayakawa, study co-author and post-doctoral research scientist.
Previous research demonstrated that multiple languages compete with each other in the brain, making it more difficult for bilinguals to process what they hear. As a result and out of necessity, they may rely more heavily on visual input to make sense of sound.
Bilingual experience is known to affect domains ranging from memory and decision making to cognitive control, but these findings suggest that learning a second language can change even our basic sensory experiences.
Given that more than half of the world’s population is bilingual, educators and clinicians working with bilinguals should be aware of how language experience can change the way people process speech. This effect of bilingualism is also relevant for developers of technology related to speech recognition such as Siri and Alexa, as well as animators of CGI.
About this neuroscience research article
Source: Northwestern University
Publisher: Organized by NeuroscienceNews.com.
Image Source: NeuroscienceNews.com image is adapted from the Northwestern University news release.
Original Research: Open access research for “Language Experience Changes Audiovisual Perception” by Viorica Marian, Sayuri Hayakawa, Tuan Q. Lam and Scott R. Schroeder in Brain Sciences. Published May 2018. doi:10.3390/brainsci8050085
Cite This NeuroscienceNews.com Article
MLA: Northwestern University. “Learning a Second Language Alters Sensory Perception.” NeuroscienceNews. NeuroscienceNews, 14 May 2018. <https://neurosciencenews.com/bilingual-sensory-perception-9039/>.
APA: Northwestern University. (2018, May 14). Learning a Second Language Alters Sensory Perception. NeuroscienceNews. Retrieved May 14, 2018 from https://neurosciencenews.com/bilingual-sensory-perception-9039/
Chicago: Northwestern University. “Learning a Second Language Alters Sensory Perception.” https://neurosciencenews.com/bilingual-sensory-perception-9039/ (accessed May 14, 2018).
Language Experience Changes Audiovisual Perception
Can experience change perception? Here, we examine whether language experience shapes the way individuals process auditory and visual information. We used the McGurk effect—the discovery that when people hear a speech sound (e.g., “ba”) and see a conflicting lip movement (e.g., “ga”), they recognize it as a completely new sound (e.g., “da”). This finding suggests that the brain fuses input across auditory and visual modalities, demonstrating that what we hear is profoundly influenced by what we see. We find that cross-modal integration is affected by language background, with bilinguals experiencing the McGurk effect more than monolinguals. This increased reliance on the visual channel is not due to decreased language proficiency, as the effect was observed even among highly proficient bilinguals. Instead, we propose that the challenges of learning and monitoring multiple languages have lasting consequences for how individuals process auditory and visual information.