Mastering Motion: How Our Brains Predict to Perfect the Catch

Summary: Researchers shed new light on our brain’s ability to predict movement, aiding in seemingly simple tasks like grabbing a moving object.

The study found intricate coordination between vision and motor skills, suggesting that the brain anticipates a target’s movement to compensate for an 80-millisecond delay in visuomotor response. This understanding could improve our comprehension of various neurological disorders marked by visuomotor control problems.

The research may lead to advanced computational behavior analysis strategies to study these disorders.

Key Facts:

  1. The ability to predict movement is integral to effectively grabbing moving objects.
  2. Despite an 80-millisecond delay in visuomotor behavior, the primates successfully captured moving crickets, indicating their ability to predict motion.
  3. The findings could improve our understanding of neurological disorders related to visuomotor control issues and help in developing precise behavioral analysis methods.

Source: University of Rochester

Have you ever made a great catch—like saving a phone from dropping into a toilet or catching an indoor cat before it runs outside? That skill—grabbing a moving object—takes precise interactions within and between our visual and motor systems.

Researchers at the Del Monte Institute for Neuroscience at the University of Rochester have found that the ability to visually predict movement may be an important part of the ability to make a great catch—or grab a moving object.

Using data from both the primates and the crickets, the researchers built a detailed model of vision-guided reaching behavior. Credit: Neuroscience News

“We were able to develop a method that allowed us to analyze behaviors in a natural environment with high precision, which is important because, as we showed, behavioral patterns differ in a controlled setting,” said Kuan Hong Wang, PhD, a Dean’s Professor of Neuroscience at the University of Rochester Medical Center. Wang led the study, out today in Current Biology, in collaboration with Jude Mitchell, PhD, assistant professor of Brain and Cognitive Sciences at the University of Rochester, and Luke Shaw, a graduate student in the Neuroscience Graduate Program at the University of Rochester School of Medicine & Dentistry.

“Understanding how natural behaviors work will give us better insight into what is going awry in an array of neurological disorders.”

Researchers used multiple high-speed cameras and DeepLabCut—an AI method that locates key points on the hand and arm in video data to measure movement—to record where the primates were looking and how the arm and hand moved as they reached for and caught moving crickets.
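
As a rough sketch of what the downstream analysis of such tracking data might look like (assuming keypoints have already been tracked, for example with DeepLabCut, and exported as per-frame coordinates), hand kinematics can be computed directly from the tracked trajectories. The file name, column names, confidence threshold, and frame rate below are hypothetical, not details from the study.

```python
import numpy as np
import pandas as pd

# Hypothetical per-frame keypoint coordinates exported from a tracker such as
# DeepLabCut: one row per frame, with x/y columns and a confidence score per
# keypoint. The file name and column names are assumptions for this sketch.
tracks = pd.read_csv("marmoset_reach_keypoints.csv")

FPS = 250.0  # assumed high-speed camera frame rate (frames per second)

# Mask low-confidence detections and interpolate over the gaps so the
# time base stays uniform.
tracks.loc[tracks["wrist_confidence"] < 0.9, ["wrist_x", "wrist_y"]] = np.nan
wrist = tracks[["wrist_x", "wrist_y"]].interpolate().to_numpy()

# Finite-difference wrist velocity, lightly smoothed to suppress tracking jitter.
velocity = np.gradient(wrist, axis=0) * FPS            # per-axis velocity
speed = np.linalg.norm(velocity, axis=1)               # scalar wrist speed
smooth_speed = np.convolve(speed, np.ones(5) / 5, mode="same")

print(f"Peak wrist speed: {smooth_speed.max():.1f} px/s")
```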

Researchers found an 80-millisecond delay in the animals’ visuomotor behavior—the time it takes for what the eyes see to begin guiding the hand toward the target. Despite this measurable delay, the primates still grabbed the crickets, meaning they had to predict the cricket’s movement.
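
As an illustration of how such a delay can be estimated (a generic approach, not necessarily the study’s exact analysis), one can cross-correlate the target’s velocity trace with the hand’s velocity trace and take the lag that maximizes their correlation. The signal names, sampling rate, and synthetic data below are assumptions for the sketch.

```python
import numpy as np

FPS = 250.0  # assumed common sampling rate of both velocity traces (frames/s)

def estimate_delay(target_vel: np.ndarray, hand_vel: np.ndarray, max_lag: int = 60) -> float:
    """Return the lag (ms) at which hand velocity best tracks target velocity.

    Both inputs are 1-D speed traces sampled at FPS. A positive result means
    the hand lags the target. This is a generic cross-correlation estimate,
    not the published analysis.
    """
    t = (target_vel - target_vel.mean()) / target_vel.std()
    h = (hand_vel - hand_vel.mean()) / hand_vel.std()
    lags = np.arange(0, max_lag + 1)
    corr = [np.corrcoef(t[: len(t) - lag], h[lag:])[0, 1] for lag in lags]
    return 1000.0 * lags[int(np.argmax(corr))] / FPS

# Synthetic check: a hand trace that echoes the target 80 ms (20 frames) later.
rng = np.random.default_rng(0)
target = np.cumsum(rng.normal(size=2000))
hand = np.roll(target, 20) + rng.normal(scale=0.1, size=2000)
print(f"Estimated visuomotor delay: {estimate_delay(target, hand):.0f} ms")  # ~80 ms
```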

Using data from both the primates and the crickets, the researchers built a detailed model of vision-guided reaching behavior.
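
The paper’s abstract describes this modeling as multivariate linear regression relating hand velocity to cricket kinematics. The sketch below shows one plausible form such a model could take, regressing hand velocity at each moment on the cricket’s position and velocity roughly one visuomotor delay earlier; the variable names, delay value, and model form are illustrative assumptions, not the authors’ exact implementation.

```python
import numpy as np

FPS = 250.0
DELAY = int(round(0.080 * FPS))  # assumed ~80 ms visuomotor delay, in frames

def fit_reach_model(hand_vel: np.ndarray, cricket_pos: np.ndarray, cricket_vel: np.ndarray):
    """Fit hand velocity at time t from cricket kinematics at time t - DELAY.

    hand_vel, cricket_pos, cricket_vel: (T, 2) arrays of x/y samples on a
    common time base. Returns least-squares coefficients mapping the delayed
    cricket position and velocity (plus an intercept) to hand velocity.
    If the velocity terms carry substantial weight, the hand is effectively
    extrapolating where the target will be, i.e., predicting its motion.
    """
    n = len(hand_vel) - DELAY
    X = np.hstack([
        cricket_pos[:n],          # where the cricket was, one delay ago
        cricket_vel[:n],          # how fast it was moving, one delay ago
        np.ones((n, 1)),          # intercept term
    ])
    y = hand_vel[DELAY:]          # the hand's response one delay later
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs
```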

“These findings allow us to identify unique behavioral control strategies for mechanistic studies and engineering applications,” said Wang.

“Visuomotor control problems exist in many neurological disorders due to brain lesions, stroke, and genetic factors. This research may help develop computational behavior analysis strategies to precisely characterize behavioral alterations in naturalistic settings and understand their underlying causes.”

Funding: This work was supported by the National Institutes of Health, the Schmitt Program of Integrative Neuroscience, and the Del Monte Institute for Neuroscience Pilot Program.

About this neuroscience research news

Author: Kelsie Smith Hayduk
Source: University of Rochester
Contact: Kelsie Smith Hayduk – University of Rochester
Image: The image is credited to Neuroscience News

Original Research: Closed access.
“Fast prediction in marmoset reach-to-grasp movements for dynamic prey” by Kuan Hong Wang et al. Current Biology


Abstract

Fast prediction in marmoset reach-to-grasp movements for dynamic prey

Highlights

  • Video tracking marmoset reach-to-grasp behaviors in a cricket-hunting task
  • Marmosets open grasp aperture earlier for faster velocity reaches
  • Short visuo-motor delays in marmoset reaching for moving crickets
  • Marmosets predict moving target positions to compensate for visuo-motor delays

Summary

Primates have evolved sophisticated, visually guided reaching behaviors for interacting with dynamic objects, such as insects, during foraging. 

Reaching control in dynamic natural conditions requires active prediction of the target’s future position to compensate for visuo-motor processing delays and to enhance online movement adjustments. 

Past reaching research in non-human primates mainly focused on seated subjects engaged in repeated ballistic arm movements to either stationary targets or targets that instantaneously change position during the movement. 

However, those approaches impose task constraints that limit the natural dynamics of reaching. A recent field study highlights predictive aspects of visually guided reaching during insect prey capture among wild marmoset monkeys. 

To examine the complementary dynamics of similar natural behavior within a laboratory context, we developed an ecologically motivated, unrestrained reach-to-grasp task involving live crickets. We used multiple high-speed video cameras to capture the movements of common marmosets (Callithrix jacchus) and crickets stereoscopically and applied machine vision algorithms for marker-free object and hand tracking.

Contrary to estimates under traditional constrained reaching paradigms, we find that reaching for dynamic targets can operate at incredibly short visuo-motor delays around 80 ms, rivaling the speeds that are typical of the oculomotor system during closed-loop visual pursuit. 

Multivariate linear regression modeling of the kinematic relationships between the hand and cricket velocity revealed that predictions of the expected future location can compensate for visuo-motor delays during fast reaching. These results suggest a critical role of visual prediction facilitating online movement adjustments for dynamic prey.
