A new study sheds light on the mechanisms behind metacognition and how we make sense of the world around us.
Study points to the evolutionary and developmental similarities between sensory cells in the inner ear and skin.
Our perception of musical timing is closely linked to the quality of the sound.
According to a new study, the absence of the pejvakin molecule appears to be responsible for noise-induced hearing loss.
fMRI brain scans reveal that semantic tuning during both reading and listening to words is highly correlated in selective areas of the cerebral cortex. The new brain maps enabled researchers to accurately predict which words would activate specific regions of the cortex.
Researchers have successfully recreated the whisker map a mouse creates of its surroundings to help it navigate the world, catch food and avoid predators.
Researchers link the harmonic structure in pop songs to their ultimate popularity. Findings suggest that unexpected chord changes, when followed by predictable harmonies, help make a song popular.
When listening to a phrase over and over, the words often begin to sound like a song. Researchers found this speech-to-song illusion does not decrease with age.
An auditory-based machine learning algorithm was able to identify children diagnosed with depression and anxiety with 80% accuracy after analyzing recordings of their speech. The algorithm identified eight audio features that signify a higher risk of depression. Of these, a lower pitch of voice, repeatable speech inflections, and a higher-pitched response to surprise stimuli were most indicative of depression. Researchers hope to develop a smartphone app that records and analyzes speech immediately, helping to better detect children at risk of internalizing disorders.
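The summary above does not disclose the study's actual feature set or model, but one of the named cues, voice pitch, is straightforward to estimate from audio. As a purely illustrative sketch (the pitch method, the 150 Hz threshold, and the function names here are assumptions, not the researchers' algorithm), a toy pipeline might estimate fundamental frequency by autocorrelation and apply a simple threshold rule:

```python
import numpy as np

def estimate_pitch(signal, sr, fmin=80.0, fmax=400.0):
    """Toy pitch estimator: find the autocorrelation peak within the
    plausible lag range for human voice (fmin..fmax Hz)."""
    sig = signal - signal.mean()
    # Autocorrelation at non-negative lags.
    corr = np.correlate(sig, sig, mode="full")[len(sig) - 1:]
    lag_min = int(sr / fmax)          # shortest period of interest
    lag_max = int(sr / fmin)          # longest period of interest
    lag = lag_min + np.argmax(corr[lag_min:lag_max])
    return sr / lag                   # period (samples) -> frequency (Hz)

def low_pitch_flag(pitch_hz, threshold_hz=150.0):
    """Hypothetical rule standing in for one of the eight features:
    flag a lower-than-threshold mean pitch as a risk indicator."""
    return pitch_hz < threshold_hz
```

A real system would extract many such features per recording and feed them to a trained classifier; this sketch only shows the kind of low-level signal measurement such features rest on.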