How the Brain Processes Speech in Real Time
Auditory Neuroscience, Neuroscience · March 7, 2025 · 4 min read
Researchers have developed a computational framework that maps how the brain processes speech during real-world conversations. Using electrocorticography (ECoG) and AI speech models, the study analyzed over 100 hours of brain activity, revealing how different regions handle sounds, speech patterns, and word meanings. The findings show that the brain processes language sequentially: activity moves from thought to speech before speaking, and works backward to interpret spoken words when listening.
How the Brain Transforms Speech Pitch into Meaning
Neuroscience · March 3, 2025 · 5 min read
A new study reveals that Heschl’s gyrus, once thought only to process sound, actually plays a crucial role in interpreting speech melody, or prosody. Researchers tracked brain activity in epilepsy patients with implanted electrodes and found that this region encodes pitch accents as meaningful linguistic signals, separate from word sounds.
Genetic Link Between Humans and the Evolution of Speech Found
Genetics, Neuroscience · February 18, 2025 · 8 min read
New research suggests a genetic variant in the NOVA1 protein may have played a key role in the emergence of human speech. Scientists introduced this exclusively human variant into mice and observed altered vocalizations, indicating a potential role in vocal communication.
Baby Babble Syncs With Heartbeats to Shape Speech Development
Neuroscience · December 17, 2024 · 4 min read
Infants' heart rate rhythms are closely linked to their early vocalizations, including coos, babbles, and emerging words. Researchers found that babies were most likely to produce sounds when their heart rate reached a peak or trough, with speech-like sounds occurring during heart rate deceleration.
Hearing Loss Disrupts Speech Coordination
Auditory Neuroscience, Neuroscience · October 6, 2024 · 3 min read
A recent study reveals that hearing plays a vital role in coordinating speech movements. Researchers found that when individuals briefly couldn't hear their own speech, their ability to control their jaw and tongue movements declined.
Humans Slow Speech to Help Dogs Understand
Neuroscience, Open Neuroscience Articles, Psychology · October 2, 2024 · 3 min read
A new study reveals that humans naturally slow their speech when talking to dogs, which helps dogs better understand commands. Researchers analyzed speech rates and brain responses in 30 dogs and 27 humans across five languages, finding that humans speak at around three syllables per second to their pets, compared to four syllables per second when talking to other humans.
Right-Sided DBS: Effective Parkinson’s Treatment Without Speech Loss
Neurology, Neuroscience · September 30, 2024 · 5 min read
A new study has shown that unilateral deep brain stimulation (DBS) on the right side of the brain improves motor symptoms in Parkinson's patients without causing significant declines in verbal fluency. Researchers found that DBS on the left hemisphere led to more noticeable declines in word retrieval and generation.
Improving Voice Recognition for People with Speech Disabilities
Neurology, Neuroscience · September 27, 2024 · 8 min read
A new study shows that automatic speech recognition (ASR) systems trained on speech from people with Parkinson’s disease are 30% more accurate in transcribing similar speech patterns. Researchers collected over 151 hours of recordings from participants with varying degrees of dysarthria, a speech disorder common in Parkinson’s patients, and used the data to train ASR systems.
Infants Hear More Speech Than Music At Home
Neuroscience · May 31, 2024 · 5 min read
A new study compared the amount of music and speech infants hear at home. Researchers found that infants are exposed to more speech than music, with the gap increasing as they grow.
Babies’ Squeals and Growls Show Early Vocal Practice Patterns
Neuroscience, Open Neuroscience Articles · May 29, 2024 · 4 min read
Infants' vocalizations, like squeals and growls, appear in significant clusters, suggesting active noisemaking play and sound practice. Researchers analyzed recordings from 130 infants and found that 40% of squeals and growls occurred in significant clusters.
Stuttering Linked to Specific Brain Network
Neuroscience · May 28, 2024 · 5 min read
A new study has identified a common brain network responsible for stuttering, regardless of its cause. Researchers found that strokes causing stuttering and developmental stuttering both affect the same brain areas.
How the Brain Distinguishes Music from Speech
Auditory Neuroscience, Neuroscience · May 28, 2024 · 5 min read
A new study reveals how our brain distinguishes between music and speech using simple acoustic parameters. Researchers found that slower, steady sounds are perceived as music, while faster, irregular sounds are perceived as speech.