Summary: Researchers are exploring how machines and humans can work together to compose new songs.
Source: Kingston University.
Could artificial intelligence (AI) be part of the future of music creation? Kingston University experts are taking a leading role in new research exploring how machines and humans can work together to compose music.
Investigating how the latest technology could be used as a creative partner by musicians, researchers from Kingston University and Queen Mary University of London have been training a cutting-edge computer system to produce new compositions based on what it has learned about a particular type of music.
The researchers chose Irish folk music for the study – due to its relatively well-defined structure and the wealth of available data – and trained the AI system on more than 23,000 tunes written in a text-based music notation format. This enabled it to generate new tunes by drawing on the patterns and structures it had learned.
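The training approach can be illustrated with a toy sketch. This is not the researchers' actual system (which used machine learning on a large corpus); it is a minimal character-level bigram model over a few hypothetical ABC-style tune fragments, showing how a model trained on text-notated tunes can emit new sequences that follow the patterns it has seen.

```python
import random
from collections import defaultdict

def train_bigrams(tunes):
    """Count, for each character, which characters follow it in the corpus."""
    model = defaultdict(list)
    for tune in tunes:
        for a, b in zip(tune, tune[1:]):
            model[a].append(b)
    return model

def generate(model, seed, length=16, rng=None):
    """Build a new sequence by repeatedly sampling a plausible next character."""
    rng = rng or random.Random(0)
    out = [seed]
    for _ in range(length - 1):
        choices = model.get(out[-1])
        if not choices:
            break
        out.append(rng.choice(choices))
    return "".join(out)

# Hypothetical fragments in an ABC-like text notation (illustrative only).
corpus = ["GABcdedc", "cdefgage", "GAGEDEGA"]
model = train_bigrams(corpus)
print(generate(model, "G"))
```

The real project replaced the bigram counts with a neural network, but the principle is the same: every generated transition is one the model has observed in the training tunes.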
The results were so impressive that some of the tunes will be performed live at a concert later this month as part of the project, according to Dr Oded Ben-Tal, senior lecturer in music technology at Kingston University.
“We didn’t expect any of the machine-generated melodies to be very good – but we, and several other musicians we worked with, were really surprised at the quality of the music the system created,” he said.
In addition to using the AI system to create new musical material, Dr Ben-Tal worked interactively alongside it – inputting an initial sequence of notes and then selecting from the subsequent notes produced by the system. Through this process, they were able to compose a piece of music together.
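The interactive workflow described above can be sketched in a few lines. This is hypothetical code, not the project's software: a stand-in `propose` function plays the role of the AI by offering candidate next notes, and a callback plays the role of the composer choosing among them.

```python
import random

NOTES = ["C", "D", "E", "F", "G", "A", "B"]

def propose(history, k=3, rng=None):
    """Stand-in for the AI system: return k candidate next notes.
    (Random here; the real system would condition on the history.)"""
    rng = rng or random.Random(len(history))
    return rng.sample(NOTES, k)

def co_compose(seed, choose, steps=8):
    """Alternate machine proposals with human choices to build a melody."""
    melody = list(seed)
    for _ in range(steps):
        candidates = propose(melody)
        melody.append(choose(candidates))
    return melody

# Example: a 'composer' who always picks the first proposal.
piece = co_compose(["G", "A"], choose=lambda cands: cands[0])
print(" ".join(piece))
```

The design point is the division of labour: the machine supplies options, while the human retains every creative decision, which matches the co-composition process the article describes.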
While dismissing any suggestion that song-writing machines could eventually replace humans, Dr Ben-Tal believes artificial intelligence-based systems could end up being the perfect tool for amateur composers in need of some inspiration.
“For beginners, a system like this would help get you started and avoid the intimidating aspect of composing your own tune as you could work interactively together,” the music technology expert, who is based within the University’s Faculty of Arts and Social Science, explained. “Meanwhile, an experienced composer could work with the system to generate new ideas by using their own musical concepts as a starting point.”
Dr Ben-Tal also emphasised that the intention of the research was to show how AI could be used to enhance the creative process – not replace it. He hopes it could go on to be used by music teachers and students in the future.
“People are reluctant to believe machines can be creative – it’s seen as a very human trait,” he said. “However, the fact of the matter is, technology and creativity have been interconnected for a long time and this is just another step in that direction. One thing we need to remember, though, is that this system doesn’t think of music in the way we do. It’s able to produce interesting results but it has no understanding of the context.”
The £150,000 study, funded through the Arts & Humanities Research Council (AHRC) and the France-based Cluster of Excellence Labex, will culminate in a concert in London later this month.
The concert will include a performance, on the recently retuned church organ, of harmonisations of tunes from the AI system in the style of JS Bach’s chorales, created by another computer programme that also uses machine learning methods.
“This shows how two computer systems can work together to co-create a score. This concert also highlights the important role the composer and performer has in bringing the music to life,” said Dr Ben-Tal. “This opens up a whole new world of possibilities for music making.”
Tickets can be purchased for the concert, which takes place at St Dunstan and All Saints, Stepney, London, on May 23 2017.
Image Source: NeuroscienceNews.com image is adapted from the Kingston University news release.
Video Source: The video is credited to The Bottomless Tune Box.