Why do we sometimes have trouble paying attention?

Summary: Patterns of functional connectivity reliably predicted when people were more or less focused on a task.

Source: University of Chicago

How much of this page will you read? How much will you remember? And does it make a difference when you’re reading, or where?

Those are the sorts of questions that a University of Chicago neuroscientist asks in an innovative new study—one that examines brain scans to uncover how attention is sustained over time, and when it might fluctuate.

“Maybe in general, we’re pretty good at paying attention, or maybe we struggle—but it’s not the same all the time,” said lead author Monica Rosenberg, an assistant professor in UChicago’s Department of Psychology. “We wanted to build a model that could predict a person’s attentional state based on what we see in their brain scans.”

Published in the Proceedings of the National Academy of Sciences, the study relies on functional MRI data collected both for this project and for previous research, combining results from 107 individuals across five different data sets. By using what Rosenberg calls “green science”—replicating results in data collected for other purposes—the study expands its pool of participants beyond what is usually found in a single lab.

The research examines functional MRI scans of people who performed a computerized task multiple times in one day—watching a stream of images and pressing a button in response to some of them—as well as of those who performed the same task on different days. It also examines brain scans of people who had been administered anesthesia, as well as 30 scans of a single individual collected over the course of 10 months. The participants’ ages ranged from 18 to 56.

“If we want to build brain-based models that are applicable in clinical or translational settings, they have to be able to generalize across data sets,” said Rosenberg, an expert on attention. “It has to be the case that models don’t just predict behavior from data collected on a single hospital scanner from a single group of individuals.

“If a model can’t predict something about people across different sites and populations, it’s less practically useful.”

Prior research has found that every person has a unique pattern of functional brain connectivity—a sort of fingerprint that can predict their cognitive and attentional abilities.

Using brain scans, a new study proposes a model that can predict when someone is paying closer attention—and when their attention might fluctuate. Image is adapted from the University of Chicago news release.

Rosenberg and her co-authors—including scholars from Yale University and the University of Florida—tested whether those patterns could extend to predict how a person’s attention changes from moment to moment, or day to day.

They found that patterns of functional brain connectivity reliably predicted when people were more and less focused on the computer task. These predictions were highly accurate when averaged across many scan sessions, but the patterns still predicted attentional state even when measured in short windows of time, such as a 30-second segment of an fMRI session.
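For readers curious what such a prediction looks like in practice, below is a minimal, hypothetical sketch of a connectome-based scoring scheme in Python. It is not the authors' pipeline: the parcellation size, window length, and "high-attention"/"low-attention" edge sets are simulated stand-ins, and a real analysis would learn those edge sets from independent training data and validate the scores against measured task performance.

    # Minimal illustrative sketch (simulated data, hypothetical edge masks).
    import numpy as np

    rng = np.random.default_rng(0)

    n_regions = 50      # number of brain parcels (hypothetical parcellation)
    n_windows = 40      # short windows within a scan, e.g. ~30 s each
    window_len = 30     # fMRI time points per window

    # Hypothetical edge masks standing in for networks identified in training
    # data (connections whose strength tracks better or worse attention).
    triu = np.triu_indices(n_regions, k=1)
    n_edges = triu[0].size
    high_mask = rng.random(n_edges) < 0.05                  # "high-attention" edges
    low_mask = (rng.random(n_edges) < 0.05) & ~high_mask    # "low-attention" edges

    scores = []
    for _ in range(n_windows):
        # Simulated regional time series for one window (time x regions).
        ts = rng.standard_normal((window_len, n_regions))
        # Functional connectivity: pairwise correlation between regional signals.
        fc = np.corrcoef(ts, rowvar=False)
        edges = fc[triu]
        # Network-strength score: high-attention edges minus low-attention edges.
        scores.append(edges[high_mask].sum() - edges[low_mask].sum())

    scores = np.array(scores)
    print("Per-window attention scores (first 5):", np.round(scores[:5], 2))
    # In a real study, these window-by-window scores would be compared with
    # observed attentional state, e.g. task accuracy in the same windows.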

Studies in this area have historically relied on single data sets, due in part to the high cost of fMRI.

“It’s only in the past couple of years that sharing data sets has become much more common,” Rosenberg said. “That’s what gives us access to a wider variety of samples, which allow us to ask how general our models are.”

Rosenberg hopes further research can provide insights into how attention changes over longer periods of time, like development and aging.

She is also in the process of testing whether predictive models can translate to settings outside the lab. For example, her lab is asking whether patterns of functional brain connectivity can predict attention fluctuations as people listen to a story or watch a movie.

“When we collect brain data in an MRI scanner,” she said, “we often give people psychological tasks that involve seeing pictures and pressing buttons. That’s really not how we navigate the world.”

About this neuroscience research article

Source:
University of Chicago
Media Contacts:
Jack Wang – University of Chicago
Image Source:
The image is adapted from the University of Chicago news release.

Original Research: Closed access
“Functional connectivity predicts changes in attention observed across minutes, days, and months”. Rosenberg et al.
PNAS doi:10.1073/pnas.1912226117.

Abstract

Functional connectivity predicts changes in attention observed across minutes, days, and months

The ability to sustain attention differs across people and changes within a single person over time. Although recent work has demonstrated that patterns of functional brain connectivity predict individual differences in sustained attention, whether these same patterns capture fluctuations in attention within individuals remains unclear. Here, across five independent studies, we demonstrate that the sustained attention connectome-based predictive model (CPM), a validated model of sustained attention function, generalizes to predict attentional state from data collected across minutes, days, weeks, and months. Furthermore, the sustained attention CPM is sensitive to within-subject state changes induced by propofol as well as sevoflurane, such that individuals show functional connectivity signatures of stronger attentional states when awake than when under deep sedation and light anesthesia. Together, these results demonstrate that fluctuations in attentional state reflect variability in the same functional connectivity patterns that predict individual differences in sustained attention.
