Human Brain Tunes in to Visual Rhythms in Sign Language

Summary: A sign language study helps researchers better understand how the brain processes language.

Source: University of Chicago.

The human brain works in rhythms and cycles. These patterns occur at predictable frequencies that depend on what a person is doing and on what part of the brain is active during the behavior.

Similarly, there are rhythms and patterns out in the world, and for the last 20 years, scientists have been perplexed by the brain’s ability to “entrain,” or match up, with these patterns. Language is one of those areas in which scientists observe neural entrainment: When people listen to speech, their brain waves lock on to the volume-based rhythms they hear. Since people can’t pay attention to everything happening in their environment at once, this phase locking is thought to help them anticipate when important information is likely to appear.

Many studies have documented this phenomenon in language processing; however, it has been difficult to tell whether neural entrainment is specialized for spoken language. In a new study in the Proceedings of the National Academy of Sciences, University of Chicago scholars designed an experiment using sign language to answer that question.

“To determine if neural entrainment to language is specialized for speech or if it is a general-purpose tool that humans can use for anything that is temporally predictable, we had to go outside of speech and outside of auditory perception,” said Geoffrey Brookshire, the study’s lead author and a PhD student in the Department of Psychology.

Brookshire worked with Daniel Casasanto, assistant professor of psychology and leader of the Experience and Cognition Lab; Susan Goldin-Meadow, the Beardsley Ruml Distinguished Service Professor in the Department of Psychology and an acclaimed scholar of language and gesture; Howard Nusbaum, professor of psychology and an expert in spoken language and language use; and Jenny Lu, a PhD student specializing in sign language, gesture and language development.

“By looking at sign, we’re learning something about how the brain processes language more generally. We’re solving a mystery we couldn’t crack by studying speech alone,” Casasanto said.

In speech, the brain locks on to syllables, words and phrases, and those rhythms occur below 8 Hz, or eight cycles per second. Vision also has a preferred frequency onto which it latches.

“When we focus on random flashes of light, for example, our brains most enthusiastically lock on to flashes around 10 Hz. By looking at sign language, we can ask whether the important thing for entrainment is which sense you’re using, or the kind of information you’re getting,” Brookshire said.

Measuring visual rhythms

To determine whether people tune in to visual rhythms the same way they tune in to the auditory rhythms of language, the researchers showed videos of stories told in American Sign Language to fluent signers and measured the signers’ brain activity as they watched. Once the researchers had these electroencephalogram readings, they needed a way to measure visual rhythms in sign language.

While there are well-established methods to measure rhythms in speech, there are no automatic, objective equivalents for the temporal structure of sign language. So the researchers created one.

They developed a new metric, called the instantaneous visual change, which summarizes the degree of change at each time point during signing. They ran the videos participants had watched through their new algorithm to identify peaks and valleys in visual change between frames. The largest peaks corresponded to large, quick movements.
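The paper’s exact formulation of the metric isn’t reproduced here, but the core idea, summarizing how much the image changes from one video frame to the next, can be sketched with simple frame differencing. The function name and the synthetic frames below are illustrative, not from the study:

```python
import numpy as np

def instantaneous_visual_change(frames):
    """Rough proxy for a visual-change metric: total absolute pixel
    change between consecutive grayscale frames.
    `frames` has shape (n_frames, height, width)."""
    diffs = np.abs(np.diff(frames.astype(float), axis=0))
    return diffs.sum(axis=(1, 2))  # one value per frame transition

# Synthetic demo: five 4x4 frames where the image jumps abruptly
# between frames 2 and 3, mimicking a large, quick movement.
frames = np.zeros((5, 4, 4))
frames[3:] = 1.0
ivc = instantaneous_visual_change(frames)
print(int(np.argmax(ivc)))  # prints 2: the transition from frame 2 to 3
```

Peaks in such a time series mark the moments of greatest motion, which is what the researchers aligned against the EEG recordings.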

With this roadmap illustrating the magnitude of visual changes over time in the videos, Brookshire overlaid the participants’ EEGs to see whether people entrain around the normal visual frequency of about 10 Hz, or at the lower frequencies of signs and phrases in sign language—about 2 Hz.
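One standard way to quantify this kind of stimulus-brain phase locking is spectral coherence between the stimulus time course and the EEG. The toy sketch below uses simulated signals and is an assumption about the general analysis style, not the paper’s actual pipeline; the sampling rate, amplitudes, and noise level are illustrative:

```python
import numpy as np
from scipy.signal import coherence

fs = 250  # EEG sampling rate in Hz (illustrative)
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)

# Toy "visual change" signal with a 2 Hz rhythm, like signs and phrases.
stimulus = np.sin(2 * np.pi * 2 * t)
# Toy "EEG" that partially tracks the stimulus, plus noise.
eeg = 0.5 * np.sin(2 * np.pi * 2 * t + 0.3) + rng.normal(0, 1, t.size)

# Coherence near 1 at a frequency means the two signals are phase-locked there.
f, cxy = coherence(stimulus, eeg, fs=fs, nperseg=1024)
peak_freq = f[np.argmax(cxy)]
print(round(peak_freq, 1))  # coherence peaks near the 2 Hz rhythm
```

In this framing, the question in the study reduces to where along the frequency axis the coherence between the visual-change signal and the EEG peaks: near vision’s preferred ~10 Hz, or near sign language’s ~2 Hz.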

Their discovery answers a fundamental question that has been lingering for years in research on speech entrainment: Is it specialized for auditory speech? The study reveals that the brain entrains depending on the information in the signal—not on the differences between seeing and hearing. Participants’ brain waves locked into the specific frequencies of sign language, rather than locking into the higher frequency that vision tends to prefer.

“This is an exciting finding because scientists have been theorizing for years about how adaptable or flexible entrainment may be, but we were never sure if it was specific to auditory processing or if it was more general purpose,” Brookshire said. “This study suggests that humans have the ability to follow perceptual rhythms and make temporal predictions in any of our senses.”

Image: a person signing. UChicago scholars designed an experiment using sign language to determine whether neural entrainment is specialized for spoken language. NeuroscienceNews.com image is adapted from the University of Chicago news release.

In a broader sense, neuroscientists want to understand how the human brain creates and perceives language, and entrainment has emerged as an important mechanism. In revealing neural entrainment as a generalized strategy for improving sensitivity to informational peaks, this study takes significant steps toward advancing the understanding of human language and perception.

“The piece of the paper that I find particularly exciting is that it compares how signers and non-signers process American Sign Language stimuli,” Goldin-Meadow said. “Although both groups showed the same level of entrainment in early visual regions, they displayed differences in frontal regions. This finding sets the stage for us to identify aspects of neural entrainment that are linked to the physical properties of the visual signal compared to aspects that appear only with linguistic knowledge.”

About this neuroscience research article

Source: Tina A. Cormier – University of Chicago
Image Source: NeuroscienceNews.com image is adapted from the University of Chicago news release.
Original Research: Full open access research for “Visual cortex entrains to sign language” by Geoffrey Brookshire, Jenny Lu, Howard C. Nusbaum, Susan Goldin-Meadow, and Daniel Casasanto in PNAS. Published online May 30, 2017. doi:10.1073/pnas.1620350114

Cite This NeuroscienceNews.com Article

[cbtabs][cbtab title=”MLA”]University of Chicago “Human Brain Tunes in to Visual Rhythms in Sign Language.” NeuroscienceNews. NeuroscienceNews, 31 May 2017.
<https://neurosciencenews.com/sign-language-rhythm-6804/>.[/cbtab][cbtab title=”APA”]University of Chicago (2017, May 31). Human Brain Tunes in to Visual Rhythms in Sign Language. NeuroscienceNews. Retrieved May 31, 2017 from https://neurosciencenews.com/sign-language-rhythm-6804/[/cbtab][cbtab title=”Chicago”]University of Chicago “Human Brain Tunes in to Visual Rhythms in Sign Language.” https://neurosciencenews.com/sign-language-rhythm-6804/ (accessed May 31, 2017).[/cbtab][/cbtabs]


Abstract

Visual cortex entrains to sign language

Despite immense variability across languages, people can learn to understand any human language, spoken or signed. What neural mechanisms allow people to comprehend language across sensory modalities? When people listen to speech, electrophysiological oscillations in auditory cortex entrain to slow (<8 Hz) fluctuations in the acoustic envelope. Entrainment to the speech envelope may reflect mechanisms specialized for auditory perception. Alternatively, flexible entrainment may be a general-purpose cortical mechanism that optimizes sensitivity to rhythmic information regardless of modality. Here, we test these proposals by examining cortical coherence to visual information in sign language. First, we develop a metric to quantify visual change over time. We find quasiperiodic fluctuations in sign language, characterized by lower frequencies than fluctuations in speech. Next, we test for entrainment of neural oscillations to visual change in sign language, using electroencephalography (EEG) in fluent speakers of American Sign Language (ASL) as they watch videos in ASL. We find significant cortical entrainment to visual oscillations in sign language <5 Hz, peaking at ∼1 Hz. Coherence to sign is strongest over occipital and parietal cortex, in contrast to speech, where coherence is strongest over the auditory cortex. Nonsigners also show coherence to sign language, but entrainment at frontal sites is reduced relative to fluent signers. These results demonstrate that flexible cortical entrainment to language does not depend on neural processes that are specific to auditory speech perception. Low-frequency oscillatory entrainment may reflect a general cortical mechanism that maximizes sensitivity to informational peaks in time-varying signals.

