Sense of Touch Improves Control of Robotic Arm

Summary: Supplementing a brain-computer interface (BCI) with stimulation that evokes tactile sensations makes it easier for users to manipulate objects with robotic arm prosthetics.

Source: University of Pittsburgh

Most able-bodied people take their ability to perform simple daily tasks for granted—when they reach for a warm mug of coffee, they can feel its weight and temperature and adjust their grip accordingly so that no liquid is spilled. People with full sensory and motor control of their arms and hands can feel that they’ve made contact with an object the instant they touch or grasp it, allowing them to start moving or lifting it with confidence. 

But those tasks become much more difficult when a person operates a prosthetic arm, let alone a mind-controlled one.

In a paper published today in Science, a team of bioengineers from the University of Pittsburgh Rehab Neural Engineering Labs describes how adding brain stimulation that evokes tactile sensations makes it easier for the operator to manipulate a brain-controlled robotic arm. In the experiment, supplementing vision with artificial tactile perception cut the time spent grasping and transferring objects in half, from a median of 20.9 seconds to 10.2 seconds.

“In a sense, this is what we hoped would happen—but perhaps not to the degree that we observed,” said co-senior author Jennifer Collinger, Ph.D., associate professor in the Pitt Department of Physical Medicine and Rehabilitation. “Sensory feedback from limbs and hands is hugely important for doing normal things in our daily lives, and when that feedback is lacking, people’s performance is impaired.”

Study participant Nathan Copeland, whose progress was described in the paper, is the first person in the world to be implanted with tiny electrode arrays not just in his brain’s motor cortex but in his somatosensory cortex as well—a region of the brain that processes sensory information from the body. The arrays allow him not only to control the robotic arm with his mind, but also to receive tactile sensory feedback, similar to how neural circuits operate when a person’s spinal cord is intact.

“I was already extremely familiar with both the sensations generated by stimulation and performing the task without stimulation. Even though the sensation isn’t ‘natural’—it feels like pressure and gentle tingle—that never bothered me,” said Copeland. “There wasn’t really any point where I felt like stimulation was something I had to get used to. Doing the task while receiving the stimulation just went together like PB&J.”

After a car crash that left him with limited use of his arms, Copeland enrolled in a clinical trial testing the sensorimotor microelectrode brain-computer interface (BCI) and was implanted with four microelectrode arrays developed by Blackrock Microsystems (also commonly referred to as Utah arrays). 

This paper builds on an earlier study that described for the first time how stimulating sensory regions of the brain with tiny electrical pulses can evoke sensations in distinct regions of a person’s hand, even after that person has lost feeling in their limbs due to spinal cord injury.

In this new study, the researchers combined reading information out of the brain to control the robotic arm’s movement with writing information back in to provide sensory feedback.
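To make that read/write idea concrete, here is a minimal, purely illustrative sketch of such a closed loop in Python. Everything in it—the simulated firing rates, the 96-channel count, the linear force-to-stimulation mapping, and the stand-in functions—is an assumption made for illustration, not the decoding or stimulation pipeline used in the study itself.

```python
"""Illustrative closed-loop bidirectional BCI sketch.

"Read" path: motor-cortex activity -> decoded grasp command -> robotic arm.
"Write" path: contact force on the hand -> microstimulation of somatosensory cortex.
All functions and constants are hypothetical stand-ins for illustration only.
"""

import random

STIM_GAIN = 20.0    # hypothetical scaling: microamps of stimulation per newton of force
MAX_STIM_UA = 80.0  # hypothetical safety ceiling on stimulation amplitude


def record_motor_cortex() -> list[float]:
    """Stand-in for firing rates recorded from the motor-cortex arrays."""
    return [random.random() for _ in range(96)]  # assumed 96 recording channels


def decode_grasp_command(firing_rates: list[float]) -> float:
    """Hypothetical decoder: map population activity to a grasp-closure command in [0, 1]."""
    return sum(firing_rates) / len(firing_rates)


def read_contact_force(grasp_command: float) -> float:
    """Stand-in for force sensors on the robotic hand (newtons)."""
    return 4.0 * grasp_command


def stimulate_somatosensory(amplitude_ua: float) -> None:
    """Stand-in for intracortical microstimulation through the sensory arrays."""
    print(f"stimulating somatosensory cortex at {amplitude_ua:.1f} uA")


def closed_loop_step() -> None:
    # Read path: neural activity is decoded into a command for the robotic arm.
    command = decode_grasp_command(record_motor_cortex())
    # Write path: contact force is converted into a stimulation amplitude,
    # evoking a tactile percept that scales with how hard the hand is gripping.
    force = read_contact_force(command)
    stimulate_somatosensory(min(force * STIM_GAIN, MAX_STIM_UA))


if __name__ == "__main__":
    for _ in range(3):
        closed_loop_step()
```

The point of the sketch is only the structure: decoding and stimulation run together in one loop, so the user feels contact at roughly the moment the hand makes it, rather than relying on vision alone.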

In a series of tests in which the BCI operator was asked to pick up and transfer various objects from a table to a raised platform, tactile feedback delivered through electrical stimulation allowed the participant to complete the tasks roughly twice as fast as in tests without stimulation.


In the new paper, the researchers wanted to test the effect of sensory feedback in conditions that would resemble the real world as closely as possible.

“We didn’t want to constrain the task by removing the visual component of perception,” said co-senior author Robert Gaunt, Ph.D., associate professor in the Pitt Department of Physical Medicine and Rehabilitation. “When even limited and imperfect sensation is restored, the person’s performance improved in a pretty significant way. We still have a long way to go in terms of making the sensations more realistic and bringing this technology to people’s homes, but the closer we can get to recreating the normal inputs to the brain, the better off we will be.”

Additional authors of this study include Sharlene Flesher, Ph.D., Jeffrey Weiss, M.S., Christopher Hughes, M.S., Angelica Herrera, B.S., and Michael Boninger, M.D., all of Pitt; John Downey, Ph.D., of the University of Chicago; and Elizabeth Tyler-Kabara, M.D., of the University of Texas at Austin.

Funding: This work was supported by the Defense Advanced Research Projects Agency (DARPA) and Space and Naval Warfare Systems Center Pacific (SSC Pacific) under Contract No. N66001-16-C-4051 and the Revolutionizing Prosthetics program (Contract No. N66001-10-C-4056). 

About this robotics research news

Source: University of Pittsburgh
Contact: Anastasia (Ana) Gorelova – University of Pittsburgh
Image: The image is in the public domain

Original Research: Closed access.
“A brain-computer interface that evokes tactile sensations improves robotic arm control” by Jennifer Collinger et al. Science


Abstract

A brain-computer interface that evokes tactile sensations improves robotic arm control

Prosthetic arms controlled by a brain-computer interface can enable people with tetraplegia to perform functional movements. However, vision provides limited feedback because information about grasping objects is best relayed through tactile feedback.

We supplemented vision with tactile percepts evoked using a bidirectional brain-computer interface that records neural activity from the motor cortex and generates tactile sensations through intracortical microstimulation of the somatosensory cortex.

This enabled a person with tetraplegia to substantially improve performance with a robotic limb; trial times on a clinical upper-limb assessment were reduced by half, from a median time of 20.9 to 10.2 seconds.

Faster times were primarily due to less time spent attempting to grasp objects, revealing that mimicking known biological control principles results in task performance that is closer to able-bodied human abilities.
