Summary: A new study that investigates how the human brain deciphers sensory input could revolutionize robotics and neuroprosthetics.
Source: University of New South Wales.
UNSW neuroscientists have arrived at a fundamentally new understanding of how the brain deciphers neural inputs, one that could transform the next generation of robotic prosthetics.
When our skin scans a surface, everything we feel is conveyed through the nerves as electrical impulses, which neurons in the brain receive as signals somewhat like Morse code.
Neurophysiologists Ingvars Birznieks and Richard Vickery research the sense of touch, and how we extract so much information – pressure, shape, texture, and vibration – from a single kind of signal.
They say their new findings represent a whole new way of looking at how our brains make judgements about the environment, and could have applications in telesurgery, prostheses and robotics.
Seven years ago, the pair began working with a pin-array device called an Optacon (optical-to-tactile converter). Designed to stimulate the fingertip, the device has 144 tiny pins that move very quickly. When the device is moved over printed text, the pins move in sync with the letters, allowing a blind person to read.
“A unique unintended feature of this device is that one tap of the pin can generate a single electrical impulse in touch neurons. These impulses are what neurons use to communicate, so we could use it to interrogate the nervous system,” Dr Vickery says.
Dr Birznieks explains that they were now able to send a stream of code with whatever information they wanted.
“We could talk to the brain using its own language and see how it interprets the messages we sent to it. Using this dialogue we were in a position to learn the brain’s language or neural code.”
The researchers found that the brain uses the periods of “quiet” between impulses to make judgements about the environment. This flies in the face of the conventional view that neural activity itself is the main driver of human perception.
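The burst-gap idea can be illustrated with a short sketch. This is a hypothetical toy model, not the researchers' actual analysis code: spikes are grouped into bursts, and perceived frequency is assumed to track the inverse of the silent gap between bursts.

```python
# Toy illustration of the "burst gap" coding idea (hypothetical model,
# not the authors' code): group afferent spikes into bursts and read
# out frequency from the silent gaps between successive bursts.

def burst_gaps(spike_times, max_intra_burst_isi=0.005):
    """Group spikes (times in seconds) into bursts, where spikes closer
    together than max_intra_burst_isi belong to the same burst, and
    return the silent gaps between successive bursts."""
    bursts = [[spike_times[0]]]
    for prev, t in zip(spike_times, spike_times[1:]):
        if t - prev <= max_intra_burst_isi:
            bursts[-1].append(t)   # continue the current burst
        else:
            bursts.append([t])     # a long silence starts a new burst
    # Gap = silence between the last spike of one burst
    # and the first spike of the next.
    return [nxt[0] - cur[-1] for cur, nxt in zip(bursts, bursts[1:])]

def perceived_frequency(spike_times):
    """Hypothetical readout: perceived frequency as the inverse of the
    mean silent gap between bursts."""
    gaps = burst_gaps(spike_times)
    return 1.0 / (sum(gaps) / len(gaps))

# Three bursts of spikes, separated by 46 ms silent gaps:
spikes = [0.000, 0.002, 0.004, 0.050, 0.052, 0.054, 0.100, 0.102]
```

Under this toy readout, adding more spikes inside each burst leaves the frequency estimate unchanged, while lengthening the silent gap lowers it, which is the qualitative behaviour the study reports.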
So unexpected was the finding of this new coding strategy, they spent several years trying to disprove it before having the confidence to publish the work to the scientific community.
The paper “Spike timing matters in novel neuronal code involved in vibrotactile frequency perception” has been published online this month in Current Biology.
Dr Birznieks says the work has transformed the understanding of the basic principles of how our brains encode information.
“We expect this will re-write the textbooks,” he says.
Dr Vickery says the knowledge could help researchers build better brain machine interfaces and haptic devices, providing coding strategies that could add tactile function to robotics, for example.
The researchers are exploring the possibility of developing a method to restore a sense of touch in amputees. They expect that by manipulating the timing of electrical impulse generation in the nerve, it will be possible to make people feel varied tactile sensations that, for example, become more intense or feel faster.
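As a sketch of what such timing control might look like (purely illustrative; the function and all parameter values are assumptions, not the researchers' protocol), one could construct a stimulation pulse train whose silent gap between bursts is set from a target perceived frequency:

```python
# Illustrative sketch only: generate stimulation pulse times whose
# inter-burst silent gap is chosen to target a perceived frequency,
# following the burst-gap idea. All parameter values are assumptions.

def pulse_train(target_hz, n_bursts=5, spikes_per_burst=3,
                intra_burst_isi=0.002):
    """Return pulse times (seconds) arranged as bursts separated by a
    silent gap of 1 / target_hz."""
    gap = 1.0 / target_hz  # hypothesised gap-to-frequency mapping
    times = []
    t = 0.0
    for _ in range(n_bursts):
        for _ in range(spikes_per_burst):
            times.append(t)
            t += intra_burst_isi
        t = times[-1] + gap  # silence before the next burst begins
    return times

# Two bursts of three pulses, gaps tuned for a 20 Hz percept:
train = pulse_train(20.0, n_bursts=2)
```

Under the burst-gap hypothesis, shortening the gap should make the vibration feel faster, while changing what happens within each burst might alter other qualities such as intensity.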
Image Source: NeuroscienceNews.com image is adapted from the University of New South Wales news release.
Original Research: Abstract for “Spike Timing Matters in Novel Neuronal Code Involved in Vibrotactile Frequency Perception” by Ingvars Birznieks and Richard M. Vickery in Current Biology. Published online May 4 2017 doi:10.1016/j.cub.2017.04.011
Spike Timing Matters in Novel Neuronal Code Involved in Vibrotactile Frequency Perception
•Temporal spike patterns may shape frequency perception regardless of spike count
•Periodicity is not the most salient temporal cue for vibrotactile frequency
•Tactile frequency is determined by duration of the silent gap between spike bursts
•This new code is well suited to signal naturalistic complex vibratory patterns
Skin vibrations sensed by tactile receptors contribute significantly to the perception of object properties during tactile exploration and to sensorimotor control during object manipulation. Sustained low-frequency skin vibration (<60 Hz) evokes a distinct tactile sensation referred to as flutter whose frequency can be clearly perceived. How afferent spiking activity translates into the perception of frequency is still unknown. Measures based on mean spike rates of neurons in the primary somatosensory cortex are sufficient to explain performance in some frequency discrimination tasks; however, there is emerging evidence that stimuli can be distinguished based also on temporal features of neural activity. Our study’s advance is to demonstrate that temporal features are fundamental for vibrotactile frequency perception. Pulsatile mechanical stimuli were used to elicit specified temporal spike train patterns in tactile afferents, and subsequently psychophysical methods were employed to characterize human frequency perception. Remarkably, the most salient temporal feature determining vibrotactile frequency was not the underlying periodicity but, rather, the duration of the silent gap between successive bursts of neural activity. This burst gap code for frequency represents a previously unknown form of neural coding in the tactile sensory system, which parallels auditory pitch perception mechanisms based on purely temporal information where longer inter-pulse intervals receive higher perceptual weights than short intervals. Our study also demonstrates that human perception of stimuli can be determined exclusively by temporal features of spike trains independent of the mean spike rate and without contribution from population response factors.