What goes up may actually be down

Summary: Virtual reality study reveals people plan movements and anticipate the force of gravity by “seeing it” through visual cues rather than “feeling it.”

Source: Frontiers

Gravity is the unseen force that dominates our entire lives. It’s what makes walking uphill so difficult and what makes parts of our body eventually point downhill. It is unyielding, everywhere, and a force that we battle with every time we make a move. But exactly how do people account for this invisible influence while moving through the world?

A new study in Frontiers in Neuroscience used virtual reality to show that people plan their movements by “seeing” gravity through visual cues in the landscape around them, rather than by “feeling” it through changes in weight and balance. PhD student Desiderio Cano Porras, who worked in Dr. Meir Plotnik’s laboratory at the Sheba Medical Center, Israel, and colleagues found that our ability to anticipate the influence of gravity relies on visual cues, allowing us to walk safely and efficiently uphill and downhill.

To determine how vision and gravity influence the way we move, the researchers recruited 16 young, healthy adults for a virtual reality (VR) experiment. They designed a VR environment that simulated level, uphill, and downhill walking. Participants were immersed in a large-scale virtual reality system while walking on a real treadmill that was inclined upward, inclined downward, or kept level. Throughout the experiment, the VR visual environment either matched or did not match the physical cues the participants experienced on the treadmill.
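For readers who want a concrete picture of that design, the short sketch below enumerates the visual-by-physical incline combinations such a setup can produce and flags which pairings are congruent. The condition list and labels are assumptions made for illustration; they are not the study’s actual protocol table.

```python
# Illustrative sketch of the visual vs. physical incline combinations a
# VR-plus-treadmill setup like this can produce. The specific conditions and
# labels here are assumptions for illustration, not the study's protocol.
from itertools import product

INCLINES = ("uphill", "level", "downhill")

def build_conditions():
    """Pair every virtual (visual) incline with every treadmill (physical) incline."""
    conditions = []
    for visual, physical in product(INCLINES, INCLINES):
        conditions.append({
            "visual": visual,
            "physical": physical,
            "congruent": visual == physical,  # matching cues vs. sensory mismatch
        })
    return conditions

if __name__ == "__main__":
    for c in build_conditions():
        tag = "congruent" if c["congruent"] else "mismatch"
        print(f"visual={c['visual']:<8} physical={c['physical']:<8} -> {tag}")
```

The mismatched cells of this grid are the interesting ones: they are the trials in which what participants saw conflicted with what their bodies felt.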

Using this setup, the researchers were able to decouple the visual and physical cues we normally rely on when anticipating a transition to uphill or downhill walking. When participants saw a downhill slope in the VR scenery, they positioned their bodies to begin “braking” for the descent even though the treadmill actually remained level or inclined upward. The reverse also held: participants prepared for more “exertion” to go uphill in the VR environment even though the treadmill remained level or sloped downward.

The researchers showed that purely visual cues caused people to adjust their movements to compensate for predicted gravity-based changes (i.e., braking in anticipation of a downhill gravity boost and exertion in anticipation of uphill gravitational resistance). However, while participants initially relied on their vision, they quickly adapted to the real-life treadmill conditions using something called a “sensory reweighting mechanism” that reprioritized body-based cues over visual ones. In this way the participants were able to overcome the sensory mismatch and keep walking.
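As a rough way to picture what “sensory reweighting” means here, the toy model below blends a visual incline estimate with a body-based (treadmill) estimate and gradually shifts weight toward the body-based cue when the two disagree. The weights, decay rate, and update rule are hypothetical illustrations, not the model used in the study.

```python
# Toy illustration of sensory reweighting: an incline estimate that starts out
# vision-dominated and gradually re-weights toward body-based (proprioceptive)
# cues when the two sources disagree. All numbers and the update rule are
# hypothetical; this is not the model from the study.

def reweighted_incline(visual_deg, body_deg, steps=20, w_visual=0.8, decay=0.7):
    """Blend visual and body-based incline estimates over successive steps.

    visual_deg: incline suggested by the virtual scene (degrees)
    body_deg:   incline actually felt on the treadmill (degrees)
    w_visual:   initial weight given to vision (0..1)
    decay:      fraction of the visual weight kept per step when cues conflict
    """
    estimates = []
    for _ in range(steps):
        estimate = w_visual * visual_deg + (1.0 - w_visual) * body_deg
        estimates.append(estimate)
        if abs(visual_deg - body_deg) > 1e-6:   # sensory mismatch detected
            w_visual *= decay                   # reprioritize body-based cues
    return estimates

if __name__ == "__main__":
    # Virtual scene shows a 10-degree downhill while the treadmill stays level.
    trace = reweighted_incline(visual_deg=-10.0, body_deg=0.0)
    print([round(e, 2) for e in trace])  # drifts from about -8.0 toward 0.0
```

In this toy trace the estimate starts near the visually implied slope and settles toward the treadmill’s true incline, mirroring how participants first “braked” or “exerted” based on what they saw and then adapted to what they felt.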

“Our findings highlight multisensory interactions: the human brain usually gets information about forces from ‘touch’ senses; however, it generates behavior in response to gravity by ‘seeing’ it first, without initially ‘feeling’ it,” says Dr. Plotnik.

Dr. Plotnik also notes that the study is an exciting application of new and emerging virtual reality technology: “Many new digital technologies, in particular virtual reality, allow a high level of human-technology interactions and immersion. We leveraged this immersion to explore and start to disentangle the complex visual-locomotor integration achieved by human sensory systems.”

The research is a step towards the broader goal of understanding the intricate pathways that people use to decide how and when to move their bodies, but there is still work to be done.

Dr. Plotnik states that “This study is only a ‘snapshot’ of a specific task involving transitioning to uphill or downhill walking. In the future we will explore the neuronal mechanisms involved and potential clinical implications for diagnosis and treatment.”

About this neuroscience research article

Source:
Frontiers
Media Contacts:
Michael Becker – Frontiers

Original Research: Open access
“Seeing Gravity: Gait Adaptations to Visual and Physical Inclines – A Virtual Reality Study” by Desiderio Cano Porras, Gabriel Zeilig, Glen M. Doniger, Yotam Bahat, Rivka Inzelberg, and Meir Plotnik.
Frontiers in Neuroscience. doi: 10.3389/fnins.2019.01308

Abstract

Seeing Gravity: Gait Adaptations to Visual and Physical Inclines – A Virtual Reality Study

Using advanced virtual reality technology, we demonstrate that exposure to virtual inclinations visually simulating inclined walking induces gait modulations in a manner consistent with expected gravitational forces (i.e., acting upon a free body), suggesting vision-based perception of gravity. The force of gravity critically impacts the regulation of our movements. However, how humans perceive and incorporate gravity into locomotion is not well understood. In this study, we introduce a novel paradigm for exposing humans to incongruent sensory information under conditions constrained by distinct gravitational effects, facilitating analysis of the consistency of human locomotion with expected gravitational forces. Young healthy adults walked under conditions of actual physical inclinations as well as virtual inclinations. We identify and describe ‘braking’ and ‘exertion’ effects – locomotor adaptations accommodating gravito-inertial forces associated with physical inclines. We show that purely visual cues (from virtual inclinations) induce consistent locomotor adaptations to counter expected gravity-based changes, consistent with indirect prediction mechanisms. Specifically, downhill visual cues activate the braking effect in anticipation of a gravitational boost, whereas uphill visual cues promote an exertion effect in anticipation of gravitational deceleration. Although participants initially rely upon vision to accommodate environmental changes, a sensory reweighting mechanism gradually reprioritizes body-based cues over visual ones. A high-level neural model outlines a putative pathway subserving the observed effects. Our findings may be pivotal in designing virtual reality-based paradigms for understanding perception and action in complex environments with potential translational benefits.
