Touch-Sensing Neurons Are Multitaskers

Two types of touch information — the feel of an object and the position of an animal’s limb — have long been thought to flow into the brain via different channels and be integrated only in sophisticated processing regions. Now, with help from a specially devised mechanical exoskeleton that positioned monkeys’ hands in different postures, Johns Hopkins researchers have challenged that view. In a paper published in the April 22 issue of Neuron, they present evidence that the two types of information are integrated as soon as they reach the brain, by sense-processing brain cells once thought to be incapable of such higher-order processing.

Past studies had indicated that the feel of an object against the skin and information about the position of the hands and fingers were processed separately in the sensory system’s first-line sense processors and then passed along to more sophisticated brain regions to be integrated. But it was a challenge to reliably differentiate the brain activity caused by the two inputs from one another and from the brain’s commands to muscles, says Manuel Gomez-Ramirez, Ph.D., an assistant research scientist at The Johns Hopkins University. To solve that problem, Steven Hsiao, Gomez-Ramirez’s late mentor, and his colleagues developed a machine that positions a monkey’s hand and delivers stimuli to its fingers.

In this experiment, then-graduate student Sung Soo Kim, Ph.D., now a research specialist at the Howard Hughes Medical Institute, trained monkeys to perform an unrelated visual task while their hands were manipulated by the machine, which moved their fingers slightly from side to side and up and down at precise angles. The machine also pressed a plastic bar to the monkeys’ fingertips in different orientations. By monitoring the monkeys’ brains in real time, the research group saw that the position and touch information were conveyed through the same cells in the somatosensory cortex.

To perceive objects we touch, our brains must integrate tactile information with an awareness of the position of our fingers. Image credit: Sung Soo Kim.

“This study changes our understanding of how position and touch signals are combined in the brain,” says Gomez-Ramirez. This information could be used in efforts to better integrate prostheses with patients’ brains so that they behave more like natural limbs, he notes.

“Holding objects and exploring the world with our hands requires integrating many sensory signals in the brain and continuously supplying this information to motor areas so that they can issue the appropriate commands for holding objects,” Gomez-Ramirez says. “Our understanding of how these processes occur is very limited, and Steve Hsiao spent a lot of time thinking about the problem and figuring out how to test it.”

About this neuroscience research

Pramodsingh H. Thakur of The Johns Hopkins University was also an author on the paper.

Funding: NIH/National Institute of Neurological Disorders and Stroke and the Samsung Scholarship Foundation.

Source: Shawna Williams – Johns Hopkins Medicine
Image Source: The image is credited to Sung Soo Kim
Original Research: Abstract for “Multimodal Interactions between Proprioceptive and Cutaneous Signals in Primary Somatosensory Cortex” by Sung Soo Kim, Manuel Gomez-Ramirez, Pramodsingh H. Thakur, and Steven S. Hsiao in Neuron. Published online April 9, 2015. doi:10.1016/j.neuron.2015.03.020


Abstract

Multimodal Interactions between Proprioceptive and Cutaneous Signals in Primary Somatosensory Cortex

Highlights

• All subareas of SI respond to proprioceptive and cutaneous inputs
• Proprioceptive inputs from the hand are encoded by two distinct neural populations
• Proprioceptive and cutaneous inputs integrate via linear or nonlinear mechanisms

Summary

The classical view of somatosensory processing holds that proprioceptive and cutaneous inputs are conveyed to cortex through segregated channels, initially synapsing in modality-specific areas 3a (proprioception) and 3b (cutaneous) of primary somatosensory cortex (SI). These areas relay their signals to areas 1 and 2 where multimodal convergence first emerges. However, proprioceptive and cutaneous maps have traditionally been characterized using unreliable stimulation tools. Here, we employed a mechanical stimulator that reliably positioned animals’ hands in different postures and presented tactile stimuli with superb precision. Single-unit recordings in SI revealed that most neurons responded to cutaneous and proprioceptive stimuli, including cells in areas 3a and 3b. Multimodal responses were characterized by linear and nonlinear effects that emerged during early (∼20 ms) and later (>100 ms) stages of stimulus processing, respectively. These data are incompatible with the modality specificity model in SI, and provide evidence for distinct mechanisms of multimodal processing in the somatosensory system.
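
The “linear or nonlinear mechanisms” distinction can be made concrete with a small worked example. The sketch below is purely illustrative and is not the authors’ analysis code: it simulates hypothetical trial-by-trial firing rates for combinations of hand posture (a proprioceptive factor) and tactile bar orientation (a cutaneous factor), then compares an additive model against one with an interaction term. A neuron whose responses are captured by the additive model would count as a linear integrator; a significant interaction would indicate nonlinear integration.

```python
# Illustrative sketch only -- not the authors' analysis code. It simulates
# hypothetical firing rates for combinations of hand posture (proprioceptive
# factor) and tactile bar orientation (cutaneous factor), then asks whether
# an additive ("linear") model suffices or an interaction ("nonlinear")
# term is needed.
import numpy as np

rng = np.random.default_rng(0)
n_postures, n_orientations, n_trials = 3, 4, 20

# Hypothetical tuning (spikes/s): additive main effects plus a random
# interaction component whose size determines how "nonlinear" the cell is.
posture_effect = np.array([0.0, 5.0, 10.0])
orientation_effect = np.array([0.0, 3.0, 6.0, 9.0])
interaction = 4.0 * rng.standard_normal((n_postures, n_orientations))

y, p_idx, o_idx = [], [], []
for p in range(n_postures):
    for o in range(n_orientations):
        mean_rate = 20.0 + posture_effect[p] + orientation_effect[o] + interaction[p, o]
        y.append(mean_rate + 2.0 * rng.standard_normal(n_trials))  # trial-to-trial noise
        p_idx.append(np.full(n_trials, p))
        o_idx.append(np.full(n_trials, o))
y, p_idx, o_idx = map(np.concatenate, (y, p_idx, o_idx))

def design(with_interaction):
    """Dummy-coded design matrix: intercept + main effects (+ interaction)."""
    cols = [np.ones(len(y))]
    cols += [(p_idx == p).astype(float) for p in range(1, n_postures)]
    cols += [(o_idx == o).astype(float) for o in range(1, n_orientations)]
    if with_interaction:
        cols += [((p_idx == p) & (o_idx == o)).astype(float)
                 for p in range(1, n_postures) for o in range(1, n_orientations)]
    return np.column_stack(cols)

def fit_rss(X):
    """Least-squares fit; return residual sum of squares and parameter count."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid), X.shape[1]

rss_add, k_add = fit_rss(design(with_interaction=False))
rss_full, k_full = fit_rss(design(with_interaction=True))

# Nested-model F-test: does the interaction explain significantly more variance?
df1, df2 = k_full - k_add, len(y) - k_full
F = ((rss_add - rss_full) / df1) / (rss_full / df2)
print(f"additive RSS={rss_add:.1f}  full RSS={rss_full:.1f}  F({df1},{df2})={F:.2f}")
# A large F suggests nonlinear (interaction) integration; a small F is
# consistent with a purely additive, linear combination of the two inputs.
```

In the paper’s terms, neurons whose responses require the interaction term would fall on the nonlinear side of the highlighted distinction; the specific statistical tests the authors applied may differ from this simplified sketch.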

