Psilocybin, a naturally occurring psychoactive compound currently under consideration as a treatment for depression, alters people's emotional states while they listen to music. Participants who took psilocybin reported enhanced emotional processing during music listening. Researchers say combining music with psychedelic therapy may benefit those suffering from depression.
As potential threats approach, rattlesnakes increase their rattling rate. The switch to a high-frequency mode leads listeners to perceive the snake as closer than it actually is.
It is well documented that animals perk up their ears when a noise captures their attention. A new study reveals humans do the same: researchers demonstrated that humans make small, unconscious ear movements directed toward a sound that sparks attention.
A new study reveals the relationship between attentional state and emotion through pupillary reactions. Visual perception elicits emotions in all attentional states, while auditory perception elicits emotions only when attention is paid to sounds.
Using artificial intelligence and brain-computer interface technology, researchers reconstructed English words from neural signals recorded from the brains of non-human primates.
A new study reveals how the brain processes sound and how quickly neurons transition from processing the sounds of speech to recognizing them as words.
Researchers report that different brain areas are activated when guitarists and beatboxers hear previously unheard tracks performed on their instrument of choice. Beatboxers, researchers say, show increased activation in brain areas that control mouth movements, whereas guitarists show activation in areas that control hand movements. The study sheds light on the brain areas involved in auditory perception.
A new study reports that teenagers have a more difficult time discerning the emotional vocal tones of their peers. The results, researchers suggest, show that teens have not reached maturity in their ability to either identify or express vocal emotions.
Researchers report that the brain re-evaluates its interpretation of speech sounds the moment subsequent sounds are heard, updating the interpretation as necessary.