How Spatial Navigation Correlates With Language

Summary: Researchers have identified a brain mechanism that appears to support navigation systems used in both linguistic and spatial tasks.

Source: National Research University Higher School of Economics.

Cognitive neuroscientists from the Higher School of Economics and Aarhus University have experimentally demonstrated how spatial navigation affects language comprehension. The results of the study have been published in NeuroImage.

Language is a complex cognitive function performed not by isolated local brain modules but by a distributed network of cortical generators. Physical experience, such as movement through space, plays an important role in cognition and is related to how an individual mentally constructs the meaning of a sentence.

Nikola Vukovic and Yury Shtyrov carried out an experiment at the HSE Centre for Cognition & Decision Making that clarifies the relationship between the systems responsible for spatial navigation and language. Using neurophysiological data, they describe brain mechanisms that support the use of navigation systems in both spatial and linguistic tasks.

“When we read or hear stories about characters, we have to represent the inherently different perspectives people have on objects and events, and ‘put ourselves in their shoes’. Our study is the first to show that our brain mentally simulates sentence perspective by using non-linguistic areas typically in charge of visuo-spatial thought,” says Dr. Nikola Vukovic, the scientist chiefly responsible for devising and running the experiment.

Previous studies have shown that humans have certain spatial preferences that are based either on one’s body (egocentric) or are independent from it (allocentric). Although not absolute and subject to change in various situations, these preferences define how an individual perceives the surrounding space and how they plan and understand navigation in this space.

The participants of the experiment solved two types of tasks. The first was a computer-based spatial navigation task involving movement through a twisting virtual tunnel, at the end of which they had to indicate the beginning of the tunnel. The shape of the tunnel was designed so that people with egocentric and allocentric perspectives estimated the starting point differently. This difference in their subjective estimates helped the researchers split the participants according to their reference frame predispositions.
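To make the grouping logic concrete, here is a minimal, hypothetical sketch of how responses from such a tunnel task could be scored. The trial-level rule (an arrow chosen in the direction of the turn counts as egocentric, the opposite as allocentric) follows the figure caption below; the data format and the simple majority vote are illustrative assumptions, not the authors' actual scoring procedure.

```python
# Hypothetical sketch: splitting participants into egocentric vs. allocentric
# responders from their end-of-tunnel arrow choices. Field names and the
# majority-vote rule are assumptions made for illustration only.

from collections import Counter


def classify_trial(turn_direction: str, chosen_arrow: str) -> str:
    """Classify a single trial.

    For a right-turning corridor, an egocentric navigator perceives the start
    behind and to the right (chooses the right-pointing arrow), while an
    allocentric navigator keeps the initial heading and chooses the left arrow.
    """
    return "egocentric" if chosen_arrow == turn_direction else "allocentric"


def classify_participant(trials: list[tuple[str, str]]) -> str:
    """Majority vote over (turn_direction, chosen_arrow) trials."""
    votes = Counter(classify_trial(turn, arrow) for turn, arrow in trials)
    return votes.most_common(1)[0][0]


# Example: a participant who consistently points with the turn direction
trials = [("right", "right"), ("left", "left"), ("right", "right")]
print(classify_participant(trials))  # -> "egocentric"
```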

The second task involved understanding simple sentences and matching them with pictures. The pictures differed in their perspective, and the same story could be described using first-person (“I”) or second-person (“you”) pronouns. The participants had to choose which picture best matched the situation described by the sentence.

During the experiment, the participants’ electrical brain activity was recorded continuously with electroencephalography (EEG). Spectral perturbations in the EEG signal showed that many areas responsible for navigation were active during both types of task. Most interestingly for the researchers, the activation of these areas while participants heard the sentences also depended on their individual spatial preferences.
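As an illustration of the kind of time-frequency measure involved, the sketch below computes a baseline-normalised event-related spectral perturbation from single-channel epochs using plain NumPy and Morlet wavelets. The sampling rate, frequency range, wavelet width and baseline window are assumptions chosen for the example, not parameters taken from the study.

```python
# Minimal sketch of an event-related spectral perturbation (ERSP) analysis
# on simulated single-channel EEG epochs. All parameters are illustrative.

import numpy as np


def morlet_wavelet(freq, sfreq, n_cycles=7):
    """Complex Morlet wavelet for one frequency (assumed 7-cycle width)."""
    sigma_t = n_cycles / (2 * np.pi * freq)
    t = np.arange(-3.5 * sigma_t, 3.5 * sigma_t, 1 / sfreq)
    return np.exp(2j * np.pi * freq * t) * np.exp(-t**2 / (2 * sigma_t**2))


def ersp(epochs, sfreq, freqs, baseline=(0, 50)):
    """epochs: (n_trials, n_times). Returns dB power change, shape (n_freqs, n_times)."""
    n_trials, n_times = epochs.shape
    power = np.zeros((len(freqs), n_times))
    for fi, f in enumerate(freqs):
        w = morlet_wavelet(f, sfreq)
        for trial in epochs:
            conv = np.convolve(trial, w, mode="same")   # wavelet convolution
            power[fi] += np.abs(conv) ** 2               # accumulate trial power
    power /= n_trials
    base = power[:, baseline[0]:baseline[1]].mean(axis=1, keepdims=True)
    return 10 * np.log10(power / base)                   # dB relative to baseline


# Illustrative use with random data standing in for 40 trials of 500 samples
rng = np.random.default_rng(0)
fake_epochs = rng.standard_normal((40, 500))
spectra = ersp(fake_epochs, sfreq=250, freqs=np.arange(4, 31, 2))
print(spectra.shape)  # (14, 500): frequencies x time points
```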

Virtual corridor navigation task. Illustrated on the top (1) are example frames from a right-turning corridor animation. At the end of each corridor, navigators were presented with a screen showing different arrows, and had to choose which one they thought pointed to the beginning of the corridor. (2) is a schematic of the possible corridor turns, including the eccentricity of the turn relative to the heading defined by the initial segment (−120°, −90°, −45°, 45°, 90°, 120°). The top-view head representations in (3) illustrate cognitive headings based on Allocentric (A) and Egocentric (E) frames during a right-turning (90°) corridor. Note that using an egocentric strategy will lead participants to perceive the corridor start behind them and to the right, so they would choose the right-pointing arrow. Conversely, using an allocentric frame a participant would choose the left arrow in the same scenario, because their cognitive heading remains the same throughout the trial and matches the heading of the initial corridor segment. Image credit: Nikola Vukovic, Yury Shtyrov.

‘Brain activity when solving a language task is related to an individual’s egocentric or allocentric perspective, as well as to their brain activity in the navigation task. The correlation between navigation and linguistic activities proves that these phenomena are truly connected’, emphasized Yury Shtyrov, leading research fellow at the HSE Centre for Cognition & Decision Making and professor at Aarhus University, where he directs MEG/EEG research. ‘Furthermore, in the process of language comprehension we saw activation in well-known brain navigation systems, which were previously believed to make no contribution to speech comprehension’.

These data may one day be used by neurobiologists and health professionals. For example, comprehension of motion-related words suffers in some types of aphasia, and knowledge of how the brain’s navigation and language systems are linked could help in the search for mechanisms to restore these connections.

About this neuroscience research article

Source: Liudmila Mezentseva – National Research University Higher School of Economics
Publisher: Organized by NeuroscienceNews.com.
Image Source: NeuroscienceNews.com image is credited to Nikola Vukovic, Yury Shtyrov.
Original Research: Abstract for “Cortical networks for reference-frame processing are shared by language and spatial navigation systems” by Nikola Vukovic and Yury Shtyrov in NeuroImage. Published online November 1 2017 doi:10.1016/j.neuroimage.2017.08.041

Cite This NeuroscienceNews.com Article

MLA: National Research University Higher School of Economics. “How Spatial Navigation Correlates With Language.” NeuroscienceNews. NeuroscienceNews, 11 November 2017. <https://neurosciencenews.com/language-spatial-navigation-7921/>.
APA: National Research University Higher School of Economics. (2017, November 11). How Spatial Navigation Correlates With Language. NeuroscienceNews. Retrieved November 11, 2017 from https://neurosciencenews.com/language-spatial-navigation-7921/
Chicago: National Research University Higher School of Economics. “How Spatial Navigation Correlates With Language.” https://neurosciencenews.com/language-spatial-navigation-7921/ (accessed November 11, 2017).


Abstract

Cortical networks for reference-frame processing are shared by language and spatial navigation systems

To help us live in the three-dimensional world, our brain integrates incoming spatial information into reference frames, which are based either on our own body (egocentric) or independent from it (allocentric). Such frames, however, may be crucial not only when interacting with the visual world, but also in language comprehension, since even the simplest utterance can be understood from different perspectives. While significant progress has been made in elucidating how linguistic factors, such as pronouns, influence reference frame adoption, the neural underpinnings of this ability are largely unknown. Building on the neural reuse framework, this study tested the hypothesis that reference frame processing in language comprehension involves mechanisms used in navigation and spatial cognition. We recorded EEG activity in 28 healthy volunteers to identify spatiotemporal dynamics in (1) spatial navigation, and (2) a language comprehension task (sentence-picture matching). By decomposing the EEG signal into a set of maximally independent activity patterns, we localised and identified a subset of components which best characterised perspective-taking in both domains. Remarkably, we find individual co-variability across these tasks: people’s strategies in spatial navigation are also reflected in their construction of sentential perspective. Furthermore, a distributed network of cortical generators of such strategy-dependent activity responded not only in navigation, but in sentence comprehension. Thus we report, for the first time, evidence for shared brain mechanisms across these two domains – advancing our understanding of language’s interaction with other cognitive systems, and the individual differences shaping comprehension.

“Cortical networks for reference-frame processing are shared by language and spatial navigation systems” by Nikola Vukovic and Yury Shtyrov in NeuroImage. Published online November 1 2017 doi:10.1016/j.neuroimage.2017.08.041
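The “maximally independent activity patterns” mentioned in the abstract describe an independent component analysis (ICA) style decomposition of the multichannel EEG. As a rough illustration of that idea only, not the authors’ actual pipeline, the sketch below separates simulated EEG into independent time courses and their channel mixing patterns using scikit-learn’s FastICA; the channel count and number of components are arbitrary assumptions.

```python
# Illustrative ICA decomposition of simulated multichannel EEG.
# Not the study's pipeline; parameters are placeholders.

import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(1)
n_channels, n_samples = 32, 5000
eeg = rng.standard_normal((n_samples, n_channels))  # stand-in for continuous EEG

ica = FastICA(n_components=20, random_state=1)
sources = ica.fit_transform(eeg)   # (n_samples, n_components) independent time courses
mixing = ica.mixing_               # (n_channels, n_components) channel weights per component

print(sources.shape, mixing.shape)  # (5000, 20) (32, 20)
```

In a real analysis, the mixing weights would be inspected as scalp topographies to select the components of interest, which is the kind of component selection the abstract alludes to.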
