Newborn Babies Have Inbuilt Ability to Pick Out Words

Summary: Using fNIRS, researchers discovered that babies as young as three days old are able to pick out words from speech.

Source: University of Manchester.

A research study of newborn babies has revealed that humans are born with the innate skills needed to pick out words from language.

The international team of researchers discovered two mechanisms in 3-day-old infants, which give them the skills to pick out words in a stream of sounds.

The discovery provides a key insight into a first step to learning language.

The study, published in Developmental Science, is a collaboration between scientists at SISSA in Italy, the Neurospin Centre in France, the University of Liverpool and The University of Manchester. It was funded by the European Research Council.

One of the mechanisms discovered by the team is prosody, the melody of language, which allows us to recognise when a word starts and ends.

The other is what they call the statistics of language: how we track the frequency with which the sounds in a word occur together.
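To illustrate that statistical cue, here is a minimal sketch (not the researchers' analysis code) of how transitional probabilities between adjacent syllables can be computed from a continuous syllable stream. The four "words" and syllable names below are hypothetical stand-ins, not the study's stimuli; the point is that syllable pairs inside a word recur together far more reliably than pairs spanning a word boundary.

```python
import random
from collections import Counter

def transitional_probabilities(syllables):
    """P(next syllable | current syllable) for every adjacent pair in the stream."""
    pair_counts = Counter(zip(syllables, syllables[1:]))
    first_counts = Counter(syllables[:-1])
    return {pair: n / first_counts[pair[0]] for pair, n in pair_counts.items()}

# Four made-up trisyllabic "words" (illustrative only), concatenated in random
# order into one continuous stream with no pauses or boundary markers.
words = [["tu", "pi", "ro"], ["go", "la", "bu"], ["bi", "da", "ku"], ["pa", "do", "ti"]]
random.seed(0)
stream = [syll for _ in range(200) for syll in random.choice(words)]

tps = transitional_probabilities(stream)

# Within-word pairs (e.g. tu -> pi) have a transitional probability of 1.0;
# pairs that span a word boundary (e.g. ro -> go) are markedly lower. That dip
# is the statistical cue a learner can exploit to locate word boundaries.
print("within-word  tu -> pi:", round(tps[("tu", "pi")], 2))
print("boundary     ro -> go:", round(tps.get(("ro", "go"), 0.0), 2))
```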

Dr Alissa Ferry from the University of Manchester said: “We think this study highlights how sentient newborn babies really are and how much information they are absorbing.

“That’s quite important for new parents and gives them some insight into how their baby is listening to them.”

Dr Ana Fló of Neurospin said: “Language is incredibly complicated, and this study is about understanding how infants try to make sense of it when they first hear it.

“We often think of language as being made up of words, but words often blur together when we talk. So one of the first steps to learn language is to pick out the words.

“Our study shows that at just 3 days old, without understanding what it means, they are able to pick out individual words from speech.

“And we have identified two important tools that we are almost certainly born with, which give them the ability to do this.”


The researchers played the infants a three-and-a-half-minute audio clip in which four meaningless words were buried in a stream of syllables.

Using a painless technique called functional near-infrared spectroscopy (fNIRS), which shines light into the brain, they were able to measure how much light was absorbed, telling them which parts of the brain were active.
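As a rough illustration of that principle (not the study's analysis pipeline), fNIRS analyses commonly convert the measured change in light absorption at two wavelengths into changes in oxy- and deoxy-haemoglobin concentration via the modified Beer-Lambert law. The sketch below uses placeholder extinction coefficients, separation and path-length values chosen purely for illustration.

```python
import numpy as np

def hemoglobin_changes(delta_od, extinction, source_detector_dist, dpf):
    """
    delta_od            : change in optical density at each wavelength, shape (2,)
    extinction          : molar extinction coefficients, shape (2, 2),
                          rows = wavelengths, columns = (HbO, HbR)
    source_detector_dist: source-detector separation (cm)
    dpf                 : differential path-length factor per wavelength, shape (2,)
    Returns (delta_HbO, delta_HbR), the concentration changes that best explain
    the measured absorption changes under the modified Beer-Lambert law.
    """
    # Effective optical path length travelled by the light at each wavelength.
    path = source_detector_dist * np.asarray(dpf, dtype=float)
    # Solve delta_OD = (extinction @ delta_conc) * path for delta_conc.
    delta_conc = np.linalg.solve(extinction * path[:, None], np.asarray(delta_od, dtype=float))
    return delta_conc

# Example with placeholder numbers for two wavelengths (e.g. ~760 nm and ~850 nm).
delta_od = [0.012, 0.018]
extinction = np.array([[0.55, 1.67],   # placeholder epsilons at wavelength 1 (HbO, HbR)
                       [1.06, 0.78]])  # placeholder epsilons at wavelength 2 (HbO, HbR)
d_hbo, d_hbr = hemoglobin_changes(delta_od, extinction, source_detector_dist=3.0, dpf=[6.0, 6.0])
print(f"delta HbO ~ {d_hbo:.4f}, delta HbR ~ {d_hbr:.4f}")
```

An increase in oxygenated haemoglobin relative to deoxygenated haemoglobin under a given optode pair is the signature of local brain activity that fNIRS studies look for.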

Dr Perrine Brusini of the University of Liverpool noted: “We then had the infants listen to individual words and found that their brains responded differently to the words that they heard than to slightly different words.

“This showed that even from birth infants can pick out individual words from language.”

About this neuroscience research article

Source: Mike Addelman – University of Manchester
Publisher: Organized by NeuroscienceNews.com.
Image Source: NeuroscienceNews.com image is in the public domain.
Original Research: Abstract for “Newborns are sensitive to multiple cues for word segmentation in continuous speech” by Ana Fló, Perrine Brusini, Francesco Macagno, Marina Nespor, Jacques Mehler, and Alissa L. Ferry in Developmental Science. Published January 25 2019.
doi:10.1111/desc.12802



Abstract

Newborns are sensitive to multiple cues for word segmentation in continuous speech

Before infants can learn words, they must identify those words in continuous speech. Yet, the speech signal lacks obvious boundary markers, which poses a potential problem for language acquisition (Swingley, 2009). By the middle of the first year, infants seem to have solved this problem (Bergelson & Swingley, 2012; Jusczyk & Aslin, 1995), but it is unknown if segmentation abilities are present from birth, or if they only emerge after sufficient language exposure and/or brain maturation. Here, in two independent experiments, we looked at two cues known to be crucial for the segmentation of human speech: the computation of statistical co‐occurrences between syllables and the use of the language’s prosody. After a brief familiarization of about 3 minutes with continuous speech, using functional near‐infrared spectroscopy (fNIRS), neonates showed differential brain responses on a recognition test to words that violated either the statistical (Experiment 1) or prosodic (Experiment 2) boundaries of the familiarization, compared to words that conformed to those boundaries. Importantly, word recognition in Experiment 2 occurred even in the absence of prosodic information at test, meaning that newborns encoded the phonological content independently of its prosody. These data indicate that humans are born with operational language processing and memory capacities and can use at least two types of cues to segment otherwise continuous speech, a key first step in language acquisition.
