Summary: A study identifies the neural markers of beat synchronization in the brain, shedding light on how auditory perception and motor processes work together.
Source: McGill University
How do people coordinate their actions with the sounds they hear? This basic ability, which allows people to cross the street safely while hearing oncoming traffic, dance to new music or perform team events such as rowing, has puzzled cognitive neuroscientists for years. A new study led by researchers at McGill University is shining a light on how auditory perception and motor processes work together.
Keeping the beat – it takes more than just moving or listening well
In a recent paper in the Journal of Cognitive Neuroscience, the researchers, led by Caroline Palmer, a professor in McGill’s Department of Psychology, were able to identify neural markers of musicians’ beat perception. Surprisingly, these markers did not correspond to the musicians’ ability to either hear or produce a beat – only to their ability to synchronize with it.
“The authors, as performing musicians, are familiar with musical situations in which one performer is not correctly aligned in time with fellow performers – so we were interested in exploring how musicians’ brains respond to rhythms. It could be that some people are better musicians because they listen differently, or it could be that they move their bodies differently,” explains Palmer, the Canada Research Chair in Cognitive Neuroscience of Performance and the senior author on the paper.
“We found that the answer was a match between the pulsing or oscillations in the brain rhythms and the pulsing of the musical rhythm – it’s not just listening or movement. It’s a linking of the brain rhythm to the auditory rhythm.”
Super-synchronizers – an exception or a learnable skill?
The researchers used electroencephalography (EEG), which involves placing electrodes on the scalp to detect electrical activity in the brain, to measure brain activity as the participants – all of them experienced musicians – synchronized their tapping with a range of musical rhythms they were hearing. By doing so, the researchers were able to identify neural markers of the musicians’ beat perception that corresponded to their ability to synchronize well.
“We were surprised that even highly trained musicians sometimes showed reduced ability to synchronize with complex rhythms, and that this was reflected in their EEGs,” said co-first authors Brian Mathias and Anna Zamm, both PhD students in the Palmer lab. “Most musicians are good synchronizers; nonetheless, this signal was sensitive enough to distinguish the ‘good’ from the ‘better’ or ‘super-synchronizers’, as we sometimes call them.”
It’s not clear whether anyone can become a super-synchronizer, but according to Palmer, it may be possible to improve one’s ability to synchronize.
“The range of musicians we sampled suggests that the answer would be yes. And the fact that only 2–3% of the population are ‘beat deaf’ is also encouraging. Practice definitely improves your ability and improves the alignment of the brain rhythms with the musical rhythms. But whether everyone is going to be as good as a drummer is not clear.”
Funding: An NSF Graduate Fellowship to B. Mathias, a PBEEE Graduate award from FRQNT to A. Zamm, an NSERC-USRA award to P. Gianferrara, and NSERC Grant 298173 and a Canada Research Chair to C. Palmer.
Rhythm Complexity Modulates Behavioral and Neural Dynamics During Auditory–Motor Synchronization
We addressed how rhythm complexity influences auditory–motor synchronization in musically trained individuals who perceived and produced complex rhythms while EEG was recorded. Participants first listened to two-part auditory sequences (Listen condition). Each part featured a single pitch presented at a fixed rate; the integer ratio formed between the two rates varied in rhythmic complexity from low (1:1) to moderate (1:2) to high (3:2). One of the two parts occurred at a constant rate across conditions. Then, participants heard the same rhythms as they synchronized their tapping at a fixed rate (Synchronize condition). Finally, they tapped at the same fixed rate (Motor condition). Auditory feedback from their taps was present in all conditions. Behavioral effects of rhythmic complexity were evidenced in all tasks; detection of missing beats (Listen) worsened in the most complex (3:2) rhythm condition, and tap durations (Synchronize) were most variable and least synchronous with stimulus onsets in the 3:2 condition. EEG power spectral density was lowest at the fixed rate during the 3:2 rhythm and greatest during the 1:1 rhythm (Listen and Synchronize). ERP amplitudes corresponding to an N1 time window were smallest for the 3:2 rhythm and greatest for the 1:1 rhythm (Listen). Finally, synchronization accuracy (Synchronize) decreased as amplitudes in the N1 time window became more positive during the high rhythmic complexity condition (3:2). Thus, measures of neural entrainment corresponded to synchronization accuracy, and rhythmic complexity modulated the behavioral and neural measures similarly.
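The abstract's two-part rhythms can be pictured concretely: one part keeps a fixed rate while the other runs at an integer-ratio multiple of it (1:1, 1:2, or 3:2), and synchronization accuracy is scored from the timing differences between taps and stimulus onsets. The sketch below illustrates that design in Python; the function names, the base rate, and the nearest-onset asynchrony scoring are illustrative assumptions, not the paper's actual analysis pipeline.

```python
import numpy as np

def two_part_onsets(base_rate_hz, ratio, duration_s):
    """Generate onset times (in seconds) for a two-part rhythm.

    One part keeps a fixed rate (base_rate_hz), as in the study's design;
    the other runs at base_rate_hz * ratio, giving integer-ratio
    complexities such as 1:1 (ratio=1), 1:2 (ratio=2), or 3:2 (ratio=1.5).
    """
    fixed = np.arange(0, duration_s, 1.0 / base_rate_hz)
    varied = np.arange(0, duration_s, 1.0 / (base_rate_hz * ratio))
    return fixed, varied

def mean_asynchrony(tap_times, stimulus_onsets):
    """Mean absolute tap-to-onset asynchrony.

    Each tap is matched to its nearest stimulus onset -- one common
    scoring choice, assumed here for illustration.
    """
    diffs = [np.min(np.abs(stimulus_onsets - t)) for t in tap_times]
    return float(np.mean(diffs))

# A 3:2 rhythm: fixed part at 2 Hz, varied part at 3 Hz, for 4 seconds.
fixed, varied = two_part_onsets(base_rate_hz=2.0, ratio=1.5, duration_s=4.0)

# Simulated taps: the fixed-rate onsets plus small timing noise.
rng = np.random.default_rng(0)
taps = fixed + rng.normal(0.0, 0.02, size=fixed.size)
print(f"mean asynchrony: {mean_asynchrony(taps, fixed) * 1000:.1f} ms")
```

Larger asynchrony values would correspond to the poorer synchronization the paper reports for the 3:2 condition, where the two parts' onsets coincide only once per measure.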