Mind over body: The search for stronger brain-computer interfaces

Summary: Machine learning algorithm allows a brain-computer interface to readjust itself continually in the background to ensure the system is always calibrated and ready to use.

Source: University of Pittsburgh

When people suffer debilitating injuries or illnesses of the nervous system, they sometimes lose the ability to perform tasks normally taken for granted, such as walking, playing music or driving a car. They can imagine doing something, but the injury might block that action from occurring.

Brain-computer interface systems exist that can translate brain signals into a desired action to regain some function, but they can be a burden to use because they don’t always operate smoothly and need frequent readjustment, even for simple tasks.

Researchers at the University of Pittsburgh and Carnegie Mellon University are working to understand how the brain learns tasks with the help of brain-computer interface technology. In a pair of papers, the second of which was published today in Nature Biomedical Engineering, the team is advancing brain-computer interface technology intended to improve the lives of amputee patients who use neural prosthetics.

“Let’s say during your work day, you plan out your evening trip to the grocery store,” said Aaron Batista, associate professor of bioengineering in Pitt’s Swanson School of Engineering. “That plan is maintained somewhere in your brain throughout the day, but probably doesn’t reach your motor cortex until you actually get to the store. We’re developing brain-computer interface technologies that will hopefully one day function at the level of our everyday intentions.”

Batista, Pitt postdoctoral research associate Emily Oby and the Carnegie Mellon researchers have collaborated on developing direct pathways from the brain to external devices. They use electrodes smaller than a hair that record neural activity and make it available for control algorithms.

In the team’s first study, published last June in the Proceedings of the National Academy of Sciences, the group examined how the brain changes with the learning of new brain-computer interface skills.

“When the subjects form a motor intention, it causes patterns of activity across those electrodes, and we render those as movements on a computer screen. The subjects then alter their neural activity patterns in a manner that evokes the movements that they want,” said project co-director Steven Chase, a professor of biomedical engineering at the Neuroscience Institute at Carnegie Mellon.
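
To make that pipeline concrete, here is a minimal Python sketch of the kind of linear decoding Chase describes: binned activity on each electrode is mapped to a two-dimensional cursor velocity and integrated into a cursor path. The array size, weights and simulated activity below are illustrative placeholders, not the study’s actual decoder.

```python
# Minimal sketch (hypothetical, not the study's actual decoder): a linear
# mapping from binned spike counts on each electrode to 2-D cursor velocity.
import numpy as np

rng = np.random.default_rng(0)

n_electrodes = 96          # illustrative intracortical array size
n_bins = 200               # short time bins of recorded neural activity
spike_counts = rng.poisson(lam=3.0, size=(n_bins, n_electrodes))

# Decoder weights and offset; in practice these are fit during a calibration
# block by regressing recorded activity against intended cursor movement.
W = rng.normal(scale=0.01, size=(n_electrodes, 2))
b = np.zeros(2)

# Each bin of neural activity is rendered as a small cursor displacement.
velocities = spike_counts @ W + b            # shape: (n_bins, 2)
cursor_path = np.cumsum(velocities, axis=0)  # integrate velocity into position

print(cursor_path[-1])  # final cursor position after all bins
```

It is that calibration-fit mapping that tends to drift out of date as recordings change, which is the problem the new study targets.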

In the new study, the team designed technology whereby the brain-computer interface readjusts itself continually in the background to ensure the system is always in calibration and ready to use.

“We change how the neural activity affects the movement of the cursor, and this evokes learning,” said Pitt’s Oby, the study’s lead author. “If we changed that relationship in a certain way, it required that our animal subjects produce new patterns of neural activity to learn to control the movement of the cursor again. Doing so took them weeks of practice, and we could watch how the brain changed as they learned.”
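
As a rough illustration of the manipulation Oby describes, changing the decoder weights changes what a given pattern of activity does to the cursor. The published experiments used carefully designed perturbations of the mapping; the electrode shuffle below is only a stand-in for the general idea, with all names and numbers hypothetical.

```python
# Hypothetical sketch: alter how neural activity maps to cursor movement, so
# the subject must produce new activity patterns to regain control.
import numpy as np

rng = np.random.default_rng(1)

n_electrodes = 96
W_original = rng.normal(scale=0.01, size=(n_electrodes, 2))

# One crude way to change the activity-to-cursor relationship is to shuffle
# which electrodes drive the mapping; the real studies used perturbations
# designed around the low-dimensional structure of the recorded activity.
perm = rng.permutation(n_electrodes)
W_perturbed = W_original[perm, :]

activity = rng.poisson(lam=3.0, size=n_electrodes)
print("cursor velocity before perturbation:", activity @ W_original)
print("cursor velocity after perturbation: ", activity @ W_perturbed)
```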

In a sense, the algorithm “learns” how to adjust to the noise and instability that is inherent in neural recording interfaces. The findings suggest that the process for humans to master a new skill involves the generation of new neural activity patterns. The team eventually would like this technology to be used in a clinical setting for stroke rehabilitation.


Such self-recalibration has been a long-sought goal in the field of neural prosthetics, and the method presented in the team’s studies recovers automatically from instabilities without requiring the user to pause and recalibrate the system manually.

“Let’s say that the instability was so large that the subject was no longer able to control the brain-computer interface,” said Byron Yu, project co-director and professor of electrical and computer engineering and biomedical engineering at Carnegie Mellon. “Existing self-recalibration procedures are likely to struggle in that scenario, whereas we’ve demonstrated that our method can, in many cases, recover from even the most dramatic instabilities.”

Both research projects were performed as part of the Center for the Neural Basis of Cognition. This cross-institutional research and education program leverages the strengths of Pitt in basic and clinical neuroscience and bioengineering with those of Carnegie Mellon in cognitive and computational neuroscience.

Other Carnegie Mellon collaborators on the projects include postdoctoral researchers Alan Degenhart and William Bishop, who led the research.

About this neuroscience research article

Source:
University of Pittsburgh
Media Contacts:
Amerigo Allegretto – University of Pittsburgh

Original Research: Closed access
“Stabilization of a brain–computer interface via the alignment of low-dimensional spaces of neural activity” by Alan D. Degenhart, William E. Bishop, Emily R. Oby, Elizabeth C. Tyler-Kabara, Steven M. Chase, Aaron P. Batista & Byron M. Yu.
Nature Biomedical Engineering doi:10.1038/s41551-020-0542-9.

Abstract

Stabilization of a brain–computer interface via the alignment of low-dimensional spaces of neural activity

The instability of neural recordings can render clinical brain–computer interfaces (BCIs) uncontrollable. Here, we show that the alignment of low-dimensional neural manifolds (low-dimensional spaces that describe specific correlation patterns between neurons) can be used to stabilize neural activity, thereby maintaining BCI performance in the presence of recording instabilities. We evaluated the stabilizer with non-human primates during online cursor control via intracortical BCIs in the presence of severe and abrupt recording instabilities. The stabilized BCIs recovered proficient control under different instability conditions and across multiple days. The stabilizer does not require knowledge of user intent and can outperform supervised recalibration. It stabilized BCIs even when neural activity contained little information about the direction of cursor movement. The stabilizer may be applicable to other neural interfaces and may improve the clinical viability of BCIs.
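
For readers who want a concrete picture of what “aligning low-dimensional spaces” means, here is a toy Python sketch. It assumes, purely for simplicity, paired latent estimates from two sessions and an instability that is a rotation of the latent space; the published stabilizer uses factor analysis, handles more general instabilities, and does not require paired data or knowledge of user intent. All names and dimensions here are illustrative.

```python
# Toy illustration of the core idea: align two low-dimensional spaces so a
# decoder fit in one keeps working in the other.
import numpy as np
from scipy.linalg import orthogonal_procrustes

rng = np.random.default_rng(3)
n_latent, n_samples = 10, 500

# Latent neural states from a reference session; a decoder is fit on these.
latent_ref = rng.normal(size=(n_samples, n_latent))
decoder = rng.normal(scale=0.1, size=(n_latent, 2))  # latent state -> velocity

# A recording instability makes the newly extracted latent space a rotated
# (and slightly noisy) version of the reference latent space.
true_rotation = np.linalg.qr(rng.normal(size=(n_latent, n_latent)))[0]
latent_new = latent_ref @ true_rotation + 0.05 * rng.normal(size=(n_samples, n_latent))

# A Procrustes fit recovers the rotation and maps the new space back onto the
# reference one, so the original decoder keeps producing sensible output.
R, _ = orthogonal_procrustes(latent_new, latent_ref)
latent_aligned = latent_new @ R

err_before = np.linalg.norm(latent_new @ decoder - latent_ref @ decoder)
err_after = np.linalg.norm(latent_aligned @ decoder - latent_ref @ decoder)
print(f"decoder-output mismatch before alignment: {err_before:.2f}")
print(f"decoder-output mismatch after alignment:  {err_after:.2f}")
```

The point of the alignment step is that the decoder, fit once on the reference space, does not need to be refit after the recording changes; only the mapping between spaces is updated.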
