Summary: Speaking requires a precise sequence of muscle movements, long thought to be coordinated by Broca’s area in the brain. New research reveals that a different region, the middle precentral gyrus (mPrCG), plays a key role in planning and executing speech sequences.
Using brain recordings and stimulation during surgery, scientists showed that mPrCG activity increases with speech complexity and errors arise when it’s disrupted. This discovery opens new avenues for understanding speech disorders and developing assistive communication technologies.
Key Facts:
- The mPrCG helps string together speech sounds into words, a role previously attributed to Broca’s area.
- Activity in the mPrCG scales with the complexity of spoken syllable sequences.
- Disrupting the mPrCG during speech mimics apraxia, confirming its critical role.
Source: UCSF
Speaking is one of the most complicated things a human can do. Before you even say a word, your brain has to translate what you want to say into a perfectly sequenced set of instructions to the dozens of muscles you use to speak.
For more than a century, scientists thought all this planning and coordination — called speech-motor sequencing — happened in a part of the frontal lobe called Broca’s area.
Now, a new study from UC San Francisco shows that it relies on a much wider network of neurons across many brain areas. This network is centered in an area called the middle precentral gyrus, or mPrCG, which scientists thought might only control the larynx, a part of the vocal tract that helps us make high- or low-pitched sounds.
“It turns out that this part of the brain has a much more interesting and important role,” said Edward Chang, MD, Chair of Neurosurgery and senior author of the study. “It strings together the sounds of speech to form words, which is crucial to being able to pronounce them.”
The study, which appears July 16 in Nature Human Behaviour, could inspire new ways of looking at speech disorders, aid in the development of devices that allow paralyzed people to communicate, and help preserve a patient’s ability to speak after brain surgery.
Beyond Broca’s area
Broca’s area, named for the physician Pierre Paul Broca, who described it in the 1860s, has long been believed to handle most of our language processing. That encompasses both how we make sense of language we hear or read and how we produce the words we intend to say.
But several years ago, Chang, who is a member of the UCSF Weill Institute for Neurosciences and has spent over a decade exploring the question of how the brain produces speech, began to suspect that it involves areas beyond Broca’s.
In a rare case study, he’d seen that when a patient had a tumor removed from their mPrCG, they developed apraxia of speech, a condition in which people know what they want to say but struggle to coordinate the movements needed to say it clearly. The same condition didn’t result from similar surgeries in Broca’s area.
Chang and then-graduate student Jessie Liu, PhD, also noticed activity associated with speech planning in the mPrCG while developing a device to allow people with paralysis to communicate.
To investigate what was happening, Chang, Liu, and postdoctoral scholar Lingyun Zhao, PhD, worked with 14 volunteers undergoing brain surgery as part of their treatment for epilepsy. Each patient had a thin mesh of electrodes placed on the surface of their brain, which could record the brain signals occurring just before they spoke.
Neurosurgeons like Chang routinely use these electrodes to help them map where in the patient’s brain the seizures are happening. If there are speech areas nearby, the surgeon will map those as well, to avoid damaging them during surgery.
Liu and Zhao were able to piggyback on this technology to see what was happening in the mPrCG while patients were talking.
They showed the volunteers sets of syllables and words on a screen and then asked them to make the sounds out loud. Some sets were simple repeated syllables, like “ba-ba-ba,” while others included more complex sequences, like “ba-da-ga,” that contain a variety of sounds.
The researchers saw that when they gave participants more complex sequences, the mPrCG was more active than when participants were given simple ones. The team also found that the increase in activity in that region predicted how quickly the participants would begin speaking after they read the words.
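To make those two findings concrete, here is a minimal, hypothetical sketch of the kind of trial-level analysis described above: comparing pre-speech mPrCG activity between simple and complex sequences, and checking whether that activity predicts how quickly speech begins. The data, variable names, and statistical tests below are illustrative assumptions using synthetic numbers, not the study’s actual methods or recordings.

```python
# Hypothetical illustration only: synthetic stand-ins for trial-wise
# mPrCG activity, sequence complexity, and reaction time.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_trials = 200

# 0 = simple sequence (e.g., "ba-ba-ba"), 1 = complex sequence (e.g., "ba-da-ga")
complexity = rng.integers(0, 2, size=n_trials)

# Simulated pre-speech mPrCG activity (arbitrary units), generated so that
# complex trials have higher mean activity.
activity = 1.0 + 0.5 * complexity + rng.normal(0, 0.3, size=n_trials)

# Simulated reaction times (seconds): higher planning activity -> later onset.
reaction_time = 0.6 + 0.2 * activity + rng.normal(0, 0.05, size=n_trials)

# (i) Is activity higher for complex than for simple sequences?
t_stat, p_complexity = stats.ttest_ind(activity[complexity == 1],
                                       activity[complexity == 0])

# (ii) Does trial-by-trial activity predict reaction time?
r, p_rt = stats.pearsonr(activity, reaction_time)

print(f"complex vs simple activity: t = {t_stat:.2f}, p = {p_complexity:.3g}")
print(f"activity vs reaction time:  r = {r:.2f}, p = {p_rt:.3g}")
```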
“Seeing this combination — working harder to plan more complex sequences and then signaling muscles to put the plan into action — tells us that even though the mPrCG is outside of Broca’s area, it’s critical to orchestrating how we speak,” Liu said.
Connecting intention to action
The team also used the electrodes to stimulate the mPrCG in five of the study participants while they were uttering set sequences of syllables.
If the sequences were fairly simple, the participants had no problem. But when they were given more complex sequences, the stimulation caused participants to make errors resembling the apraxia of speech Chang saw in his case study.
That adds more evidence that the mPrCG is central to coordinating different speech sounds and that it acts as a bridge between what a person wants to say and the movements required to say it.
“It’s playing this vital role that had been thought to belong to Broca’s area but didn’t quite fit there,” Liu said. “This points us in a new research direction, where learning how the mPrCG does this will lead us to a new understanding of how we speak.”
Funding: This work was funded by the NIH (R01-DC012379) and philanthropy.
About this language and neuroscience research news
Author: Robin Marks
Source: UCSF
Contact: Robin Marks – UCSF
Original Research: Closed access.
“Speech sequencing in the human precentral gyrus” by Edward Chang et al. in Nature Human Behaviour
Abstract
Speech sequencing in the human precentral gyrus
Fluent speech production is mediated by serially ordering and preparing motor plans corresponding to target speech sounds, a process known as speech-motor sequencing.
Here we used high-density direct cortical recordings while 14 participants spoke utterances with varying phonemic and syllabic sequence complexity after reading a target sequence and a delay period.
Phasic activations corresponding to speech production and auditory feedback were observed, as was sustained neural activity that persisted throughout all task phases, including the target presentation, the delay period and production of the sequence.
Furthermore, sustained activity in a specific area, the middle precentral gyrus (mPrCG), was both modulated by sequence complexity and predicted reaction time, suggesting a role in speech-motor sequencing.
Electrocortical stimulation of the mPrCG caused speech disfluencies resembling those seen in apraxia of speech.
These results suggest that speech-motor sequencing is mediated by a distributed cortical network in which the mPrCG plays a central role.