Image: A woman with a head made of books. Credit: Neuroscience News

The Brain’s Reading Riddle: Dual Brain Regions Unlock Language’s Depths

Summary: New research illuminates the brain’s role in semantic integration during reading, providing insights into the challenges faced by aphasia patients.

The study found that the posterior temporal cortex activates early during semantic processing, while the inferior frontal cortex plays a broader role in understanding meaning.

By exploring how the brain deduces meaning from context, such as understanding ‘apple’ from ‘a round red fruit’, researchers shed light on the difficulties people with aphasia encounter when making semantic inferences.

Key Facts:

  1. The brain relies on the posterior temporal cortex for initial semantic processing and the inferior frontal cortex for broader comprehension.
  2. The study utilized intracranial recordings in epilepsy patients to observe how the brain derives meaning from phrases.
  3. Insights from this study offer a deeper understanding of semantic deficits in aphasia, especially after frontal strokes.

Source: UTHealth Houston

Two different regions of the brain are critical to integrating semantic information while reading, which could shed more light on why people with aphasia have difficulty with semantics, according to new research from UTHealth Houston.

The study, led by first author Elliot Murphy, PhD, postdoctoral research fellow in the Vivian L. Smith Department of Neurosurgery with McGovern Medical School at UTHealth Houston, and senior author Nitin Tandon, MD, professor and chair ad interim of the department in the medical school, was published today in Nature Communications.

Language depends largely on the integration of vocabulary across multiple words to derive semantic concepts, including reference to events and objects, and statements of truth. However, how people integrate semantic information while reading is not well understood.

“Typically, we take pieces from different words and derive a meaning that’s separate. For example, one of the definitions in our study was ‘a round red fruit’ — the word ‘apple’ doesn’t appear in that sentence, but we wanted to know how patients made that inference,” Murphy said. “We were able to expose the dynamics of how the human brain integrates semantic information, and which areas come online at different stages.”

To uncover this, researchers studied intracranial recordings in 58 epilepsy patients who read written word definitions, which were either referential or nonreferential to a common object, as well as phrases that were either coherent (“a person at the circus who makes you laugh”) or incoherent (“a place where oceans shop”).

Sentences were presented on the screen one word at a time, and researchers focused their analysis on the time window in which the final word of each sentence was presented.
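
For readers curious about what this kind of analysis looks like in practice, here is a minimal, purely illustrative Python sketch: band-pass an intracranial channel to the high-gamma range (70–150 Hz, the band reported in the study's abstract) and average its power in a window around final-word onset. The sampling rate, the 0–600 ms window, and the synthetic data are assumptions for illustration; this is not the authors' actual pipeline.

```python
# Minimal sketch (not the authors' pipeline): estimating high-gamma
# (70-150 Hz) power around final-word onset in one iEEG channel.
# The sampling rate, epoch window, and synthetic signal are assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

FS = 1000                            # assumed sampling rate (Hz)
signal = np.random.randn(10 * FS)    # placeholder for one recorded channel
final_word_onset = 5 * FS            # assumed sample index of final-word onset

# Band-pass the broadband signal to the high-gamma range reported
# in the study's abstract (70-150 Hz).
b, a = butter(4, [70, 150], btype="bandpass", fs=FS)
high_gamma = filtfilt(b, a, signal)

# The analytic amplitude from the Hilbert transform gives instantaneous power.
power = np.abs(hilbert(high_gamma)) ** 2

# Average power in a window after the final word appears (0-600 ms here;
# the exact window is an assumption, not taken from the paper).
window = power[final_word_onset : final_word_onset + int(0.6 * FS)]
print(f"Mean high-gamma power in final-word window: {window.mean():.3f}")
```

In practice such per-trial power estimates would be compared across conditions (referential vs. non-referential, coherent vs. incoherent) at each electrode to localize semantic effects.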

Overall, they found that different areas of the language network showed sensitivity to meaning across a small window of rapidly cascading activity. Specifically, they discovered the existence of complementary cortical mosaics for semantic integration in two areas: the posterior temporal cortex and the inferior frontal cortex.

The posterior temporal cortex is activated early on in the semantic integration process, while the inferior frontal cortex is particularly sensitive to all aspects of meaning, especially in deep sulcal sites, or grooves in the folds of the brain.

Murphy said these findings can help illuminate the inner dynamics of aphasia, a disorder that affects a person’s ability to express and understand written and spoken language. It can occur suddenly after a stroke or head injury, or develop slowly from a growing brain tumor or disease.

People with aphasia often have difficulty with semantic integration, meaning that while they can understand individual words, they cannot make additional semantic inferences.

“Damage to both the frontal and posterior temporal cortices disrupts semantic integration, which we see happen in individuals with various aphasias,” Murphy said. “We speculate that this intricately designed mosaic structure makes some sense of the varying semantic deficits people experience after frontal strokes.”

Co-authors with UTHealth Houston included Kathryn M. Snyder, MD/PhD student; and Patrick S. Rollo, research associate and third-year medical student, both with the Vivian L. Smith Department of Neurosurgery and the Texas Institute for Restorative Neurotechnologies (TIRN) at McGovern Medical School. Tandon is the Nancy, Clive and Pierce Runnels Distinguished Chair in Neuroscience of the Vivian L. Smith Center for Neurologic Research and the BCMS Distinguished Professor in Neurological Disorders and Neurosurgery with McGovern Medical School and a member of TIRN. Tandon is also a faculty member with The University of Texas MD Anderson UTHealth Houston Graduate School of Biomedical Sciences, where Snyder is also a student. Kiefer J. Forseth, MD, PhD, and Cristian Donos, PhD, both formerly with UTHealth Houston and now with the University of California at San Diego and the University of Bucharest in Romania, respectively, also contributed to the study.

About this information processing and language research news

Author: Jeannette Sanchez
Source: UTHealth Houston
Contact: Jeannette Sanchez – UTHealth Houston
Image: The image is credited to Neuroscience News

Original Research: Open access.
“The spatiotemporal dynamics of semantic integration in the human brain” by Elliot Murphy et al. Nature Communications


Abstract

The spatiotemporal dynamics of semantic integration in the human brain

Language depends critically on the integration of lexical information across multiple words to derive semantic concepts. Limitations of spatiotemporal resolution have previously rendered it difficult to isolate processes involved in semantic integration.

We utilized intracranial recordings in epilepsy patients (n = 58) who read written word definitions. Descriptions were either referential or non-referential to a common object. Semantically referential sentences enabled high-frequency broadband gamma activation (70–150 Hz) of the inferior frontal sulcus (IFS), medial parietal cortex, orbitofrontal cortex (OFC) and medial temporal lobe in the left, language-dominant hemisphere.

IFS, OFC and posterior middle temporal gyrus activity was modulated by the semantic coherence of non-referential sentences, exposing semantic effects that were independent of task-based referential status.

Components of this network, alongside posterior superior temporal sulcus, were engaged for referential sentences that did not clearly reduce the lexical search space by the final word.

These results indicate the existence of complementary cortical mosaics for semantic integration in posterior temporal and inferior frontal cortex.
