Newborns Develop Language Skills Within Hours of Birth

Summary: Within hours of birth, newborns begin to distinguish between sounds and become sensitive to language information.

Source: Bangor University

In contrast to the traditional view of newborns as passive creatures who mostly lie around and cry, a recent study published in Nature Human Behaviour established that within hours of birth newborns start soaking up and tuning into the specifics of the world around them, including the specific languages they will go on to speak.

Babies are known to start learning language by hearing speech while still in the womb, but there they cannot make out much detail: the sound is muffled, as if heard underwater.

The study, whose international contributors include Gary Oppenheim and Guillaume Thierry of Bangor University’s School of Human and Behavioural Sciences, worked with newborns starting within minutes of their birth, using a combination of vowels played forward (that is, naturally) and played backward (a time-reversed version of the sound).

The researchers used optical imaging, a non-invasive form of neuroimaging, to measure changes in the babies’ brains. The technique involves shining tiny torches (i.e., flashlights) at the babies’ scalps. Some of the light that enters the head bounces back, and how much returns depends on what is going on underneath, for example, how much oxygenated blood is in a given area of the brain.

To obtain accurate results, multiple torches were used, with their power and placement precisely controlled, along with very sensitive light detectors to measure tiny changes in how much light bounced back.
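For technically minded readers, the idea that "a little more or a little less light bounces back" can be sketched with the modified Beer-Lambert law, the standard relation behind this kind of optical imaging (functional near-infrared spectroscopy). The snippet below is an illustrative sketch only: the wavelengths, extinction coefficients, path length, and pathlength factor are rough textbook-style assumptions, not values from the study.

```python
import numpy as np

# Approximate extinction coefficients, in 1/(mM*cm), for oxygenated and
# deoxygenated haemoglobin at two wavelengths commonly used in NIRS.
# Illustrative values only.
EXT = np.array([
    [0.59, 1.55],   # ~760 nm: [oxy-Hb, deoxy-Hb]
    [1.06, 0.69],   # ~850 nm: [oxy-Hb, deoxy-Hb]
])

def concentration_changes(delta_od, path_cm=3.0, dpf=6.0):
    """Invert the modified Beer-Lambert law at two wavelengths.

    delta_od : change in optical density (light attenuation) at each
               wavelength, relative to a baseline measurement.
    path_cm  : source-detector separation on the scalp (assumed).
    dpf      : differential pathlength factor, accounting for light
               scattering through tissue (assumed).

    Returns the implied changes in oxy- and deoxy-haemoglobin
    concentration (mM), by solving delta_od = EXT @ dc * path_cm * dpf.
    """
    return np.linalg.solve(EXT * path_cm * dpf,
                           np.asarray(delta_od, dtype=float))

# Example: stronger absorption at 850 nm than at 760 nm is the classic
# signature of a rise in oxygenated blood in the sampled region.
d_oxy, d_deoxy = concentration_changes([0.002, 0.010])
```

Measuring at two wavelengths is what makes the inversion possible: with two equations and two unknowns, the changes in oxygenated and deoxygenated haemoglobin can be separated.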

Recordings of spoken vowels were played to the babies, who were then tested to see whether their brains responded differently when they heard the same vowels played backward versus forward.

In the first test, the babies could not distinguish between forward and backward vowels, as the contrast is very subtle (even adults fail such a discrimination test 70% of the time).

After merely five hours of exposure to this contrast, optical imaging showed that the newborns’ brains started distinguishing between the two sounds. And after a further two hours, during which the newborns mostly slept, the exposure to the vowel contrast triggered a spurt of connectivity, with neurons talking to each other on a large scale, as if they had been inspired by the language sounds they heard.

Guillaume Thierry, professor of cognitive neuroscience, said, “Our research showed that a very subtle distinction—even for the adult ear—is enough to trigger a significant brain activity surge in the newborn’s brain, showing that early experiences have potentially major consequences for cognitive development.

Image: brain maps from the study, showing that after merely five hours of exposure to this contrast, the newborns’ brains started distinguishing between the two sounds. Credit: The researchers

“In other words, we should challenge the myth that babies are mostly unaware of their environment until after a few weeks, simply because they sleep a lot, and pay attention to what babies are exposed to from the moment when they are born.”

Gary Oppenheim, lecturer in psychology, added, “When my son was born, I was surprised to see that he was immediately alert, his eyes wide open and looking around to soak in information about his strange new environment (even though a newborn’s vision is known to be quite poor).

“The work that a newborn’s ears and auditory system are doing isn’t as obvious to the naked eye, but this spectacular result shows we have remarkable sensitivity to language information from the very moment we are born and we immediately set to work developing and refining it in response to our experiences in the world, even when we appear to be just sleeping.”

About this neurodevelopment and language research news

Author: Press Office
Source: Bangor University
Contact: Press Office – Bangor University
Image: The image is credited to the researchers

Original Research: Open access.
“Rapid learning of a phonemic discrimination in the first hours of life” by Yan Jing Wu et al. Nature Human Behaviour


Rapid learning of a phonemic discrimination in the first hours of life

Human neonates can discriminate phonemes, but the neural mechanism underlying this ability is poorly understood. Here we show that the neonatal brain can learn to discriminate natural vowels from backward vowels, a contrast unlikely to have been learnt in the womb.

Using functional near-infrared spectroscopy, we examined the neuroplastic changes caused by 5 h of postnatal exposure to random sequences of natural and reversed (backward) vowels, measured immediately after exposure (T1) and again 2 h later (T2). Neonates in the experimental group were trained with the same stimuli as those used in the tests at T1 and T2.

Compared with controls, infants in the experimental group showed shorter haemodynamic response latencies for forward vs backward vowels at T1, maximally over the inferior frontal region. At T2, neural activity differentially increased, maximally over superior temporal regions and the left inferior parietal region.

Neonates thus exhibit ultra-fast tuning to natural phonemes in the first hours after birth.
