Are There Differences in How Speakers of Other Languages Hear Music?

Scientists investigate the effect of ‘native listening’ on non-linguistic sounds.

Knowledge of our mother tongue acts as a sort of auditory “template” that influences the way we perceive the sounds of other languages, an effect scientists call “native listening”. Several observations, such as the fact that many of the same cortical auditory regions handle both linguistic and musical processing, and that some auditory illusions depend on the listener’s mother tongue or dialect, have led investigators to hypothesize that native listening also transfers to non-linguistic sound stimuli such as music. To test this hypothesis, Alan Langus and colleagues used the “iambic-trochaic law”, and found that there is no such transfer: the distortion effects are limited to linguistic sounds.

The way we group notes within continuous sound sequences is described by the iambic-trochaic law (ITL): we tend to pair sounds that vary in intensity or pitch into trochees, and sounds that vary in duration into iambs. An iamb is a pair in which the stronger element follows the weaker one; a trochee is exactly the opposite. In other words, when we listen to a continuous stream of alternating tones, one strong (S) and one weak (W), segmenting the flow into trochees means hearing the sequence as SW-SW-SW-SW…, while segmenting it into iambs means hearing WS-WS-WS-WS…

According to the ITL, when sounds vary in volume or pitch we tend to prefer the trochaic pattern; when they vary in duration, we prefer iambs. Even the phrasal rhythm of a language follows either an iambic or a trochaic preference, and each language has its characteristic rhythm: some prefer an iambic pattern (e.g. Italian), others a trochaic one (e.g. Turkish or Persian).
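The ITL’s two predictions can be summarized in a few lines of code. The sketch below is purely illustrative (it is not the authors’ experimental code, and the function name and tone representation are invented for this example): each tone is an (intensity, duration) pair, and the function predicts which grouping the ITL favors for the sequence.

```python
def itl_pattern(tones):
    """Predict the ITL grouping for a sequence of (intensity, duration) tones.

    Per the iambic-trochaic law: variation in intensity (or pitch) favors
    trochaic (strong-weak) grouping; variation in duration favors iambic
    (weak-strong) grouping.
    """
    intensities = {intensity for intensity, _ in tones}
    durations = {duration for _, duration in tones}

    if len(intensities) > 1 and len(durations) == 1:
        return "trochaic"   # heard as SW-SW-SW-...
    if len(durations) > 1 and len(intensities) == 1:
        return "iambic"     # heard as WS-WS-WS-...
    return "ambiguous"      # no single cue varies cleanly

# Alternating loud/soft tones of equal length -> trochaic grouping
print(itl_pattern([(80, 200), (60, 200)] * 3))   # trochaic
# Alternating long/short tones of equal loudness -> iambic grouping
print(itl_pattern([(70, 300), (70, 100)] * 3))   # iambic
```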

Image of a man in headphones.
Our native tongue influences the way we perceive other languages. But does it also determine the way we perceive nonlinguistic sounds? Image adapted from the SISSA press release.

In a series of experiments, Langus, Nespor and colleagues tested whether the rhythm preferences of the subjects’ mother tongues also transferred to non-linguistic sounds (musical tones) and to visual stimuli. “In previous experiments, we found that iambic-trochaic rhythms also exist in the visual domain, and we hoped to find an analogy between the auditory and visual domains, given the existence of visual languages, such as sign languages for the deaf,” explains Nespor.

However, the experiments, conducted on native speakers of Italian, Persian, or Turkish, yielded negative results. It is true (“as we replicated in our study,” explains Langus) that the rhythm of the spoken language influences the perception of the sounds of other languages. “However, we found no transfer of the effect to the other domains of non-linguistic auditory and visual stimuli,” concludes the researcher.

About this neuroscience and music research

Source: Federica Sgorbissa – SISSA
Image Credit: Image is adapted from the SISSA press release.
Original Research: Abstract for “Listening Natively Across Perceptual Domains?” by Langus, Alan; Seyed-Allaei, Shima; Uysal, Ertuğrul; Pirmoradian, Sahar; Marino, Caterina; Asaadi, Sina; Eren, Ömer; Toro, Juan M.; Peña, Marcela; Bion, Ricardo A. H.; and Nespor, Marina in Journal of Experimental Psychology: Learning, Memory, and Cognition. Published online January 28, 2016. doi:10.1037/xlm0000226


Abstract

Listening Natively Across Perceptual Domains?

Our native tongue influences the way we perceive other languages. But does it also determine the way we perceive nonlinguistic sounds? The authors investigated how speakers of Italian, Turkish, and Persian group sequences of syllables, tones, or visual shapes alternating in either frequency or duration. We found strong native listening effects with linguistic stimuli. Speakers of Italian grouped the linguistic stimuli differently from speakers of Turkish and Persian. However, speakers of all languages showed the same perceptual biases when grouping the nonlinguistic auditory and the visual stimuli. The shared perceptual biases appear to be determined by universal grouping principles, and the linguistic differences caused by prosodic differences between the languages. Although previous findings suggest that acquired linguistic knowledge can either enhance or diminish the perception of both linguistic and nonlinguistic auditory stimuli, we found no transfer of native listening effects across auditory domains or perceptual modalities.

