Image caption: A baby surrounded by musical notes. Credit: Neuroscience News

Newborns’ Brains Recognize Complex Sound Patterns

Summary: New research shows that newborns can detect complex sound patterns that follow non-adjacent, language-like rules, suggesting that the ability to process such sequences is innate. Using near-infrared spectroscopy, researchers observed newborn brain responses to sequences of tones, finding that infants could distinguish between correct and incorrect patterns.

The study found that this early ability activates language-related networks, particularly in the left hemisphere, highlighting a foundation for future language skills. By six months, these networks become more specialized, showing the impact of early sound exposure on brain development.

This discovery points to the importance of early auditory experiences and opens up the possibility of musical interventions for infants to support language growth. These findings are especially relevant for babies in under-stimulating environments.

Key Facts

  • Newborns can detect non-adjacent acoustic patterns, a skill essential for language.
  • Language-processing areas of the brain are activated by sound sequences from birth.
  • Early sound exposure may help develop language-related brain networks.

Source: University of Vienna

A team of researchers, including psycholinguist Jutta Mueller from the University of Vienna, has discovered that newborns are capable of learning complex sound sequences that follow language-like rules.

This groundbreaking study provides long-sought evidence that the ability to perceive dependencies between non-adjacent acoustic signals is innate.

The findings were recently published in the prestigious journal PLOS Biology.

It has long been known that babies can learn sequences of syllables or sounds that directly follow one another. However, human language often involves patterns that link elements which are not adjacent.

For example, in the sentence “The tall woman who is hiding behind the tree calls herself Catwoman,” the subject “The tall woman” is connected to the verb ending “-s,” indicating third-person singular.

Language development research suggests that children begin to master such rules in their native language by the age of two. However, learning experiments have shown that even infants as young as five months can detect rules between non-adjacent elements, not just in language but in non-linguistic sounds, such as tones.

“Even our closest relatives, chimpanzees, can detect complex acoustic patterns when embedded in tones,” says co-author Simon Townsend from the University of Zurich.

Pattern Recognition in Sounds is Innate

Although many previous studies suggested that the ability to recognize patterns between non-adjacent sounds is innate, there was no clear-cut evidence—until now.

The international team of researchers has provided this evidence by observing the brain activity of newborns and six-month-old infants as they listened to complex sound sequences. In their experiment, newborns—just a few days old—were exposed to sequences where the first tone was linked to a non-adjacent third tone.

After only six minutes of listening to two different types of sequences, the babies were presented with new sequences that followed the same pattern but at a different pitch. These new sequences were either correct or contained an error in the pattern.
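The experimental logic described above can be sketched in code. The following is a minimal, illustrative Python sketch of an "A–x–B" artificial grammar of the kind the study describes: the first tone predicts the non-adjacent third tone, the middle tone varies, and test sequences are transposed to a different pitch and are either correct or contain a rule violation. The specific frequencies and rule pairings are hypothetical placeholders, not the actual stimuli used in the study.

```python
import random

# Hypothetical tone grammar: the first tone (A) determines the required
# third tone (B); the middle tone (x) is free to vary.
RULES = {440: 660, 550: 880}        # A (Hz) -> required B (Hz)
MIDDLE_TONES = [494, 523, 587]      # variable middle element x

def make_sequence(a, violate=False, transpose=1.0):
    """Build an A-x-B triplet; optionally break the A->B dependency
    and/or transpose the whole sequence to a new pitch level."""
    b = RULES[a]
    if violate:
        # use the B tone belonging to the other A, breaking the rule
        b = next(v for k, v in RULES.items() if k != a)
    x = random.choice(MIDDLE_TONES)
    return [round(t * transpose, 1) for t in (a, x, b)]

def follows_rule(seq, transpose=1.0):
    """Check the non-adjacent dependency between positions 0 and 2,
    undoing any pitch transposition first."""
    a, _, b = (t / transpose for t in seq)
    return RULES.get(round(a)) == round(b)
```

A correct test sequence at a new pitch still satisfies `follows_rule` once the transposition is accounted for, which mirrors the key point of the paradigm: the infants must generalize the abstract dependency, not the absolute pitches.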

Using near-infrared spectroscopy to measure brain activity, the researchers found that the newborns’ brains could distinguish between the correct and incorrect sequences.

Sounds Activate Language-Related Networks in the Brain

“The frontal cortex—the area of the brain located just behind the forehead—played a crucial role in newborns,” explains Yasuyo Minagawa from Keio University in Tokyo.

The strength of the frontal cortex’s response to incorrect sound sequences was linked to the activation of a predominantly left-hemispheric network, which is also essential for language processing.

Interestingly, six-month-old infants showed activation in this same language-related network when distinguishing between correct and incorrect sequences.

The researchers concluded that complex sound patterns activate these language-related networks from the very beginning of life. Over the first six months, these networks become more stable and specialized.

Early Learning Experiences Are Key

“Our findings demonstrate that the brain is capable of responding to complex patterns, like those found in language, from day one,” explains Jutta Mueller from the University of Vienna’s Department of Linguistics.

“The way brain regions connect during the learning process in newborns suggests that early learning experiences may be crucial for forming the networks that later support the processing of complex acoustic patterns.”

These insights are key to understanding the role of environmental stimulation in early brain development. This is especially important in cases where stimulation is lacking, inadequate, or poorly processed, such as in premature babies.

The researchers also highlighted that their findings show how non-linguistic acoustic signals, like the tone sequences used in the study, can activate language-relevant brain networks.

This opens up exciting possibilities for early intervention programs that could, for example, use musical stimulation to foster language development.

About this neurodevelopment and auditory neuroscience research news

Author: Alexandra Frey
Source: University of Vienna
Contact: Alexandra Frey – University of Vienna
Image: The image is credited to Neuroscience News

Original Research: Open access.
“Functional reorganization of brain regions supporting artificial grammar learning across the first half year of life” by Simon Townsend et al. PLOS Biology


Abstract

Functional reorganization of brain regions supporting artificial grammar learning across the first half year of life

Pre-babbling infants can track nonadjacent dependencies (NADs) in the auditory domain. While this forms a crucial prerequisite for language acquisition, the neurodevelopmental origins of this ability remain unknown.

We applied functional near-infrared spectroscopy in neonates and 6- to 7-month-old infants to investigate the neural substrate supporting NAD learning and detection using tone sequences in an artificial grammar learning paradigm.

Detection of NADs was indicated by left prefrontal activation in neonates, whereas in 6- to 7-month-olds it was indicated by activation of the left supramarginal gyrus (SMG), superior temporal gyrus (STG), and inferior frontal gyrus.

Functional connectivity analyses further indicated that the neonate activation pattern during the test phase benefited from a brain network consisting of prefrontal regions, left SMG and STG during the rest and learning phases.

These findings suggest that a left-hemispheric learning-related functional brain network may emerge at birth and serve as the foundation for the later engagement of these regions for NAD detection, thus providing a neural basis for language acquisition.
