Image caption: Musculoskeletal modeling was used to generate muscle spindle signals in the upper limb across a large-scale, naturalistic movement repertoire. Credit: Neuroscience News

Deciphering Proprioception: How the Brain Maps Movement

Summary: A new study reveals the mechanisms behind proprioception, the body’s innate ability to sense limb position and movement, which lets us move without visual cues. Utilizing musculoskeletal simulations and neural network models, researchers have advanced our understanding of how the brain integrates sensory data from muscle spindles to perceive bodily position and motion.

This study suggests that the brain prioritizes limb position and velocity in processing proprioceptive input. The findings, which could revolutionize neuroprosthetics, demonstrate the importance of task-driven modeling in uncovering the computational principles underlying sensory processing.

Key Facts:

  1. Innovative Approach to Proprioception: The study employed musculoskeletal modeling and neural network models to simulate naturalistic muscle spindle signals, offering new insights into how the brain perceives limb position and movement.
  2. Task-Driven Neural Network Models: By training neural network models on computational tasks reflecting proprioceptive processing, researchers found that tasks predicting limb position and velocity best shaped “brain-like” representations.
  3. Implications for Neuroprosthetics: Understanding proprioceptive processing at this level opens new possibilities for enhancing neuroprosthetic design, aiming for more natural and intuitive limb control.

Source: EPFL

How does your brain know the position and movement of your different body parts? The sense is known as proprioception, and it is something like a “sixth sense”, allowing us to move freely without constantly watching our limbs.

Proprioception involves a complex network of sensors embedded in our muscles that relay information about limb position and movement back to our brain. However, little is known about how the brain puts together the different signals it receives from muscles.

A new study led by Alexander Mathis at EPFL now sheds light on the question by exploring how our brains create a cohesive sense of body position and movement. Published in Cell, the study was carried out by PhD students Alessandro Marin Vargas, Axel Bisi, and Alberto Chiappa, with experimental data from Chris Versteeg and Lee Miller at Northwestern University.

“It is widely believed that sensory systems should exploit the statistics of the world and this theory could explain many properties of the visual and auditory system,” says Mathis. “To generalize this theory to proprioception, we used musculoskeletal simulators to compute the statistics of the distributed sensors.”

The researchers used this musculoskeletal modeling to simulate muscle spindle signals in the upper limb, producing a large-scale, naturalistic movement repertoire.
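To make the idea concrete, here is a minimal sketch, not the authors’ pipeline: it stands in for a full musculoskeletal simulation with a toy two-link arm, and approximates spindle output as each muscle’s length and rate of length change. The function names, moment-arm coefficients, and muscle set are hypothetical.

```python
import numpy as np

def toy_muscle_lengths(shoulder_angle, elbow_angle):
    """Hypothetical moment-arm model: maps joint angles (radians) to the
    normalized lengths of a few upper-limb muscles (not a real model)."""
    return np.stack([
        1.0 + 0.10 * shoulder_angle,   # shoulder flexor (toy)
        1.0 - 0.10 * shoulder_angle,   # shoulder extensor (toy)
        1.0 + 0.08 * elbow_angle,      # elbow flexor (toy)
        1.0 - 0.08 * elbow_angle,      # elbow extensor (toy)
    ], axis=-1)

def spindle_signals(joint_angles, dt=0.01):
    """Per-muscle length and velocity traces: a common simplification of
    how primary spindle afferents encode muscle state."""
    lengths = toy_muscle_lengths(joint_angles[:, 0], joint_angles[:, 1])
    velocities = np.gradient(lengths, dt, axis=0)
    return np.concatenate([lengths, velocities], axis=-1)

# Example: a smooth, reaching-like joint trajectory sampled at 100 Hz
t = np.arange(0, 1.0, 0.01)
angles = np.stack([0.5 * np.sin(2 * np.pi * t), 0.8 * np.cos(np.pi * t)], axis=-1)
X = spindle_signals(angles)   # shape: (timesteps, 2 * number of muscles)
print(X.shape)
```

In the study itself, the spindle statistics come from a detailed upper-limb musculoskeletal model driven by naturalistic movements; the sketch only illustrates the input format such a simulation produces.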

They then used this repertoire to train thousands of “task-driven” neural network models on sixteen computational tasks, each of which reflects a scientific hypothesis about the computations carried out by the proprioceptive pathway, which includes parts of the brainstem and somatosensory cortex.
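The following is a minimal sketch of what “task-driven” training can look like, assuming one hypothetical architecture (a small GRU) and one of the candidate tasks (predicting hand position from spindle input); the tensors are random stand-ins, and the published work trains many architectures across 16 tasks.

```python
import torch
import torch.nn as nn

class SpindleEncoder(nn.Module):
    """Maps simulated spindle input (length/velocity per muscle over time)
    to a hypothesized computational target, here end-effector position."""
    def __init__(self, n_inputs=8, n_hidden=64, n_outputs=2):
        super().__init__()
        self.rnn = nn.GRU(n_inputs, n_hidden, batch_first=True)
        self.readout = nn.Linear(n_hidden, n_outputs)

    def forward(self, spindles):              # (batch, time, n_inputs)
        features, _ = self.rnn(spindles)      # internal representation to compare with neural data
        return self.readout(features)         # (batch, time, n_outputs)

model = SpindleEncoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

spindles = torch.randn(32, 100, 8)            # stand-in for simulated spindle signals
hand_xy = torch.randn(32, 100, 2)             # stand-in for the task target
for _ in range(10):                           # toy training loop
    optimizer.zero_grad()
    loss = loss_fn(model(spindles), hand_xy)
    loss.backward()
    optimizer.step()
```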

The approach allowed the team to comprehensively analyse how different neural network architectures and computational tasks influence the development of “brain-like” representations of proprioceptive information.

They found that neural network models trained on tasks that predict limb position and velocity were most effective, suggesting that our brains prioritize integrating the distributed muscle spindle input to understand body movement and position.

The research highlights the potential of task-driven modeling in neuroscience. Unlike traditional methods that focus on predicting neural activity directly, task-driven models can offer insights into the underlying computational principles of sensory processing.
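The evaluation idea can be sketched in a few lines: regress recorded firing rates on a trained model’s internal features and score how much variance is explained on held-out data. The arrays below are random placeholders for model activations and neural recordings, and ridge regression is one simple choice of linear readout, not necessarily the exact procedure used in the paper.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
features = rng.normal(size=(2000, 64))   # model layer activations (timepoints x units)
rates = rng.normal(size=(2000, 30))      # recorded neurons (timepoints x neurons)

train, test = slice(0, 1500), slice(1500, 2000)
reg = Ridge(alpha=1.0).fit(features[train], rates[train])
ev = r2_score(rates[test], reg.predict(features[test]), multioutput="uniform_average")
print(f"held-out explained variance: {ev:.3f}")
```

Under this kind of comparison, models trained on position- and velocity-prediction tasks yielded representations that best matched recorded proprioceptive activity.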

The research also paves the way for new experimental avenues in neuroscience, since a better understanding of proprioceptive processing could lead to significant advancements in neuroprosthetics, with more natural and intuitive control of artificial limbs.

About this proprioception and brain mapping research news

Author: Nik Papageorgiou
Source: EPFL
Contact: Nik Papageorgiou – EPFL
Image: The image is credited to Neuroscience News

Original Research: Open access.
“Task-driven neural network models predict neural dynamics of proprioception” by Alexander Mathis et al. Cell


Abstract

Task-driven neural network models predict neural dynamics of proprioception

Highlights

  • We combine motion capture, biomechanics, and representation learning
  • Computational task training is used to test hypotheses of proprioceptive coding
  • Task-driven models predict neural activity better than linear and data-driven models
  • Computational task performance correlates with neural explained variance

Summary

Proprioception tells the brain the state of the body based on distributed sensory neurons. Yet, the principles that govern proprioceptive processing are poorly understood.

Here, we employ a task-driven modeling approach to investigate the neural code of proprioceptive neurons in cuneate nucleus (CN) and somatosensory cortex area 2 (S1).

We simulated muscle spindle signals through musculoskeletal modeling and generated a large-scale movement repertoire to train neural networks based on 16 hypotheses, each representing different computational goals.

We found that the emerging, task-optimized internal representations generalize from synthetic data to predict neural dynamics in CN and S1 of primates. Computational tasks that aim to predict the limb position and velocity were the best at predicting the neural activity in both areas.

Since task optimization develops representations that better predict neural activity during active than passive movements, we postulate that neural activity in the CN and S1 is top-down modulated during goal-directed movements.
