Study points to the evolutionary and developmental similarities between sensory cells in the inner ear and skin.
Combining deep learning algorithms with robotic engineering, researchers have developed a new robot that integrates vision and touch.
Using hydroxy-α-sanshool, a bioactive compound of Szechuan pepper, researchers gain new insight into how the brain detects and perceives touch.
Study reveals grammar is evident and widespread in communication based on tactile interaction. The findings suggest that when one or more linguistic channels, such as hearing or vision, are unavailable, language finds another way to create formal categories.
Study reveals people touch the areas of their partner's body that mirror the parts of their own body they enjoy having touched. A strong correlation was also found between touch and gaze, suggesting the parts of the body people like to be touched align closely with those they like to be looked at.
Microsaccades, or tiny eye movements, can serve as an index of our ability to anticipate relevant information in the environment, regardless of that information's sensory modality.