Discovering the Basics of ‘Active Touch’

Summary: Researchers identify neurons that help us sense touch and motion.

Source: Johns Hopkins Medicine.

Study in mice identifies neurons that sense touch and motion, a combo needed to actively perceive the external world.

Working with genetically engineered mice — and especially their whiskers — Johns Hopkins researchers report they have identified a group of nerve cells in the skin responsible for what they call “active touch,” a combination of motion and sensory feeling needed to navigate the external world. The discovery of this basic sensory mechanism, described online April 20 in the journal Neuron, advances the search for better “smart” prosthetics for people, ones that provide more natural sensory feedback to the brain during use.

Study leader Daniel O’Connor, Ph.D., assistant professor of neuroscience at the Johns Hopkins University School of Medicine, explains that over the past several decades, researchers have amassed a wealth of knowledge about the sense of touch. “You can open up textbooks and read all about the different types of sensors or receptor cells in the skin,” he says. “However, almost everything we know is from experiments where tactile stimulation was applied to the stationary skin; in other words, passive touch.”

Such “passive touch,” O’Connor adds, isn’t how humans and other animals normally explore their world. For example, he says, people entering a dark room might search for a light switch by actively feeling the wall with their hands. To tell if an object is hard or soft, they’d probably need to press it with their fingers. To see if an object is smooth or rough, they’d scan their fingers back and forth across its surface.

Each of these forms of touch combined with motion, he says, is an active way of exploring the world, rather than waiting to have a touch stimulus presented. They each also require the ability to sense a body part’s relative position in space, an ability known as proprioception.

While some research has suggested that the same populations of nerve cells, or neurons, might be responsible for sensing both the proprioception and the touch needed for this sensory-motor integration, whether that is true, and which neurons accomplish the feat, has remained largely unknown, O’Connor says.

To find out more, O’Connor and his team developed an experimental system with mice that allowed them to record electrical signals from specific neurons in the skin during both touch and motion.

The researchers accomplished this, they report, by working with members of a laboratory led by David Ginty, Ph.D., a former Johns Hopkins University faculty member now at Harvard Medical School, to develop genetically altered mice. In these animals, a class of sensory neurons in the skin called Merkel afferents was engineered so that the cells responded not only to touch, their “native” stimulus and one long documented in previous research, but also to blue light, which skin nerve cells don’t normally respond to.
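In a setup like this, responsiveness to brief blue-light pulses is what lets a recorded unit be identified as a Merkel afferent. A minimal sketch of that kind of check appears below; the pulse times, spike times, latency window, and reliability threshold are all assumptions for illustration, not values or code from the study.

```python
import numpy as np

def is_light_responsive(spike_times, pulse_times, window=0.010, min_reliability=0.8):
    """Hypothetical check: does the unit spike within `window` seconds
    after most blue-light pulses? All times are in seconds."""
    spike_times = np.asarray(spike_times)
    hits = 0
    for t in pulse_times:
        # a pulse counts as a "hit" if at least one spike falls in (t, t + window]
        if np.any((spike_times > t) & (spike_times <= t + window)):
            hits += 1
    reliability = hits / len(pulse_times)
    return reliability >= min_reliability, reliability

# Toy example with made-up pulse and spike times (seconds)
pulses = np.arange(0.0, 5.0, 0.5)
spikes = pulses + 0.004                     # the unit fires ~4 ms after each pulse
responsive, rel = is_light_responsive(spikes, pulses)
print(f"light-responsive: {responsive} (reliability = {rel:.2f})")
```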

The scientists trained the rodents to run on a mouse-sized treadmill fitted at the front with a small motorized pole that could move to different locations. Before the mice started running, the researchers used their touch-and-light-sensitized system to find a single Merkel afferent near each animal’s whiskers and recorded the electrical signals from this neuron with an electrode.

Much like humans use their hands to explore the world through touch, mice use their whiskers, explains O’Connor. Consequently, as the animals began running on the treadmill, they moved their whiskers back and forth in a motion that researchers call “exploratory whisking.”

Using a high-speed camera focused on the animals’ whiskers, the researchers took nearly 55 million frames of video while the mice ran and whisked. They then used machine-learning algorithms to sort the footage into three categories: frames in which the rodents were neither whisking nor in contact with the pole, frames in which they were whisking without contact, and frames in which they were whisking against the pole.
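As a rough illustration of how individual frames might be assigned to those three categories, the sketch below labels each frame from two assumed per-frame measurements, whisking amplitude and pole contact, using made-up thresholds; the study’s actual classification pipeline is more sophisticated than this.

```python
import numpy as np

def label_frames(whisk_amplitude, pole_contact, whisk_threshold=5.0):
    """Toy classifier: assign each video frame to one of three categories.
    whisk_amplitude : whisking amplitude per frame (degrees, assumed)
    pole_contact    : boolean array, True when the whisker touches the pole
    """
    labels = np.empty(len(whisk_amplitude), dtype=object)
    whisking = whisk_amplitude >= whisk_threshold
    labels[~whisking & ~pole_contact] = "quiet"             # not whisking, no contact
    labels[whisking & ~pole_contact] = "free whisking"      # whisking in air
    labels[pole_contact] = "whisking with contact"          # whisking against the pole
    return labels

# Made-up measurements for three frames
amps = np.array([1.0, 12.0, 15.0])
contact = np.array([False, False, True])
print(label_frames(amps, contact))   # ['quiet' 'free whisking' 'whisking with contact']
```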

They then connected each of these movements — using video snapshots captured 500 times every second — to the electrical signals coming from the animals’ blue-light-sensitive Merkel afferents.
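One simple way to make that connection, sketched below with made-up numbers rather than the study’s data, is to bin the spike times into 2-millisecond video frames (500 frames per second) and compare the firing rate across the three behavioral categories.

```python
import numpy as np

def spikes_per_frame(spike_times, n_frames, frame_rate=500.0):
    """Bin spike times (seconds) into video frames captured at `frame_rate` Hz."""
    edges = np.arange(n_frames + 1) / frame_rate
    counts, _ = np.histogram(spike_times, bins=edges)
    return counts

def rate_by_category(spike_times, frame_labels, frame_rate=500.0):
    """Mean firing rate (spikes per second) for each behavioral category."""
    counts = spikes_per_frame(spike_times, len(frame_labels), frame_rate)
    rates = {}
    for label in np.unique(frame_labels):
        frames = frame_labels == label
        rates[label] = counts[frames].sum() / (frames.sum() / frame_rate)
    return rates

# Toy example: six 2-ms frames of labels and a few made-up spike times (seconds)
labels = np.array(["quiet", "quiet", "free whisking",
                   "free whisking", "whisking with contact", "whisking with contact"])
spikes = [0.0051, 0.0086, 0.0091, 0.0104]
print(rate_by_category(spikes, labels))
```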

The results show that the Merkel afferents produced action potentials — the electrical spikes that neurons use to communicate with each other and the brain — when their associated whiskers contacted the pole. That finding wasn’t particularly surprising, O’Connor says, because of these neurons’ well-established role in touch.

However, he says, the Merkel afferents also responded robustly when the whiskers were moving through the air without touching the pole. By delving into the specific electrical signals, the researchers discovered that the action potentials were precisely related to the whisker’s position within each whisking cycle (its phase), rather than its absolute angle in space. These findings suggest that Merkel afferents play a dual role in touch and proprioception, and in the sensory-motor integration necessary for active touch, O’Connor says.
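A standard way to test for this kind of phase coding, sketched here with simulated data rather than anything from the paper, is to estimate the whisking phase from the whisker-angle trace with a Hilbert transform and then ask at which phases the spikes tend to occur.

```python
import numpy as np
from scipy.signal import hilbert

def whisk_phase(whisker_angle):
    """Instantaneous whisking phase (radians) from a narrowband whisker-angle trace."""
    analytic = hilbert(whisker_angle - np.mean(whisker_angle))
    return np.angle(analytic)

def phase_tuning(phase, spike_frames, n_bins=8):
    """Fraction of spikes falling in each phase bin (a crude tuning curve)."""
    bins = np.linspace(-np.pi, np.pi, n_bins + 1)
    counts, _ = np.histogram(phase[spike_frames], bins=bins)
    return counts / counts.sum()

# Simulated example: a 10 Hz sinusoidal "whisk" sampled at 500 Hz,
# with spikes planted near one particular phase of each cycle
t = np.arange(0, 1, 1 / 500.0)
angle = 10 * np.sin(2 * np.pi * 10 * t)                 # whisker angle, degrees
phase = whisk_phase(angle)
spike_frames = np.where(np.abs(phase - 1.0) < 0.2)[0]   # spikes clustered near ~1 rad
print(phase_tuning(phase, spike_frames))                # mass concentrated in one bin
```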

Image caption: Touch is normally active, resulting from self-motion. Merkel afferents (top right, green and magenta) “fired” action potentials (white) during both movement and touch as mice ran on a treadmill and moved their whiskers to explore. Merkel afferents were identified by genetically altering them to fire when exposed to blue light (bottom left, bolt), not just to movement and touch.

Although these findings are particular to mouse whiskers, he cautions, he and his colleagues believe that Merkel afferents in humans could serve a similar function, because many anatomical and physiological properties of these neurons appear to be conserved across species, including mice and humans.

Besides shedding light on a basic biological question, O’Connor says, his team’s research could also eventually improve artificial limbs and digits. Some prosthetics can now interface with the human brain, allowing users to move them with directed brain signals. While this motion is a huge advance over traditional static prosthetics, it still lacks the smooth movement of natural limbs. By integrating signals similar to those produced by Merkel afferents, he explains, researchers might eventually be able to create prosthetics that send information about touch and proprioception back to the brain, allowing movements more akin to those of natural limbs.

About this neuroscience research article

Other Johns Hopkins researchers who participated in this study include Kyle S. Severson, Duo Xu, Margaret Van de Loo, and Ling Bai.

Funding: Funding for this work was provided by the National Institutes of Health under grant numbers R01NS34814 and P30NS050274. O’Connor is supported by the Whitehall Foundation, the Klingenstein Fund, and the National Institutes of Health under grant number R01NS089652.

Source: Beatriz Vianna – Johns Hopkins Medicine
Image Source: NeuroscienceNews.com image is credited to Daniel O’Connor, Ph.D., Johns Hopkins University School of Medicine.
Original Research: Abstract for “Active Touch and Self-Motion Encoding by Merkel Cell-Associated Afferents” by Kyle S. Severson, Duo Xu, Margaret Van de Loo, Ling Bai, David D. Ginty, and Daniel H. O’Connor in Neuron. Published online April 20, 2017. doi:10.1016/j.neuron.2017.03.045



Abstract

Active Touch and Self-Motion Encoding by Merkel Cell-Associated Afferents

Highlights
• Recordings from identified Merkel afferents during active touch
• Touch and position (phase within whisk cycle) are both encoded
• A simple mechanical model accounts for Merkel afferent spiking
• Phase coding depends on both external and internal forces

Summary
Touch perception depends on integrating signals from multiple types of peripheral mechanoreceptors. Merkel-cell associated afferents are thought to play a major role in form perception by encoding surface features of touched objects. However, activity of Merkel afferents during active touch has not been directly measured. Here, we show that Merkel and unidentified slowly adapting afferents in the whisker system of behaving mice respond to both self-motion and active touch. Touch responses were dominated by sensitivity to bending moment (torque) at the base of the whisker and its rate of change and largely explained by a simple mechanical model. Self-motion responses encoded whisker position within a whisk cycle (phase), not absolute whisker angle, and arose from stresses reflecting whisker inertia and activity of specific muscles. Thus, Merkel afferents send to the brain multiplexed information about whisker position and surface features, suggesting that proprioception and touch converge at the earliest neural level.
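The abstract notes that touch responses were largely explained by a simple mechanical model driven by the bending moment at the whisker base and its rate of change. A minimal, hypothetical version of such an encoding model, with made-up weights and units rather than the parameters fit in the paper, might look like this:

```python
import numpy as np

def predicted_rate(moment, dt, w_m=50.0, w_dm=5.0, baseline=2.0):
    """Toy encoding model: firing rate as a rectified linear function of the
    bending moment M and its rate of change dM/dt (arbitrary units, assumed)."""
    dmoment = np.gradient(moment, dt)                # finite-difference dM/dt
    rate = baseline + w_m * moment + w_dm * dmoment
    return np.maximum(rate, 0.0)                     # firing rates cannot be negative

# Made-up moment trace: a brief touch ramps the bending moment up and back down
dt = 0.002                                           # 500 Hz, matching the video frame rate
moment = np.concatenate([np.zeros(10), np.linspace(0, 1, 25), np.linspace(1, 0, 25)])
print(predicted_rate(moment, dt).round(1))
```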

