Summary: A new computer interface allows participants to convey particular emotions through music by changing elements of a musical piece.
Source: University of Durham
New research conducted by experts from Durham University’s Department of Music found that people are able to convey particular emotions through music by changing certain elements of a musical piece.
The researchers created an interactive computer interface called EmoteControl which allows users to control six cues (tempo, pitch, articulation, dynamics, brightness, and mode) of a musical piece in real-time.
Participants were asked to show how they think seven different emotions (sadness, calmness, joy, anger, fear, power, and surprise) should sound as music. They did this by adjusting the musical cues in EmoteControl, in effect creating their own variations of a range of musical pieces, each portraying a different emotion.
In general, musical cues were used in a similar way to represent a specific emotion. For example, participants conveyed sadness in the music using a slow tempo, minor mode, soft dynamics, legato articulation, low pitch level, and a dark timbre.
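A cue combination like the one described for sadness can be thought of as a mapping from cues to settings. The following sketch is purely illustrative (it is not the actual EmoteControl software, and the value labels are assumptions drawn from the description above); it shows how one emotion's cue profile might be represented and summarised.

```python
# Hypothetical illustration, not the EmoteControl implementation:
# one emotion's cue combination represented as a simple mapping.
SADNESS_CUES = {
    "tempo": "slow",
    "mode": "minor",
    "dynamics": "soft",
    "articulation": "legato",
    "pitch": "low",
    "brightness": "dark",  # dark timbre
}

def describe(emotion: str, cues: dict) -> str:
    """Render a cue combination as a readable summary line."""
    settings = ", ".join(f"{cue}={value}" for cue, value in cues.items())
    return f"{emotion}: {settings}"

print(describe("sadness", SADNESS_CUES))
```

In the real interface, cues such as tempo or brightness are continuous controls adjusted in real time rather than fixed labels; the mapping here only captures the qualitative pattern the study reports.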
Tempo and mode were the two cues that most strongly affected the emotion being conveyed, while dynamics and brightness had the least effect on shaping the different emotions in the music.
The researchers also found that sadness and joy were amongst the most accurately recognised emotions, a finding consistent with previous studies.
Professor Tuomas Eerola of Durham University said that “this interactive approach allowed us to tap into the participants’ perception of how different emotions should sound like in music and helped the participants create their own emotional variations of music that encompassed different emotional content.”
This research and the EmoteControl interface have implications for other sectors where emotional content is conveyed through music, such as sound branding (marketing), music in film and TV, and adaptive music in gaming; the interface could also serve as an emotion communication medium for clinical purposes.
An Interactive Approach to Emotional Expression Through Musical Cues
Previous literature suggests that structural and expressive cues affect the emotion expressed in music. However, only a few systematic explorations of cues have been done, usually focussing on a few cues or a limited number of predetermined, arbitrary cue values. This paper presents three experiments investigating the effect of six cues and their combinations on the music’s perceived emotional expression. Twenty-eight musical pieces were created with the aim of providing flexible, ecologically valid, and unfamiliar new stimuli.
In Experiment 1, 96 participants assessed which emotions were expressed in the pieces using Likert scale ratings. In Experiment 2, a subset of the stimuli was modified by participants (N = 42) via six available cues (tempo, mode, articulation, pitch, dynamics, and brightness) to convey seven emotions (anger, sadness, fear, joy, surprise, calmness, and power), addressing the main aim of exploring the impact of cue levels on expressed emotion. Experiment 3 investigated how well the variations of the original stimuli created by participants in Experiment 2 expressed their intended emotion. Participants (N = 91) rated them alongside the seven original pieces, allowing the exploration of similarities and differences between the two sets of related pieces.
An overall pattern of cue combinations was identified for each emotion. Some findings corroborate previous studies: mode and tempo were the most impactful cues in shaping emotions, and sadness and joy were amongst the most accurately recognised emotions. Novel findings include soft dynamics being used to convey anger, and dynamics and brightness being the least informative cues.
These findings provide further motivation to investigate the effect of cues on emotions in music as combinations of multiple cues rather than as individual cues, as one cue might not give enough information to portray a specific emotion.
The new findings and discrepancies are discussed in relation to current theories of music and emotions.