The first market being targeted by artificial-intelligence music startups like Jukedeck and Amper Music is production music, with AI systems capable of generating music to be used in online videos, mobile games and other digital content.
What does that mean for the existing synchronisation sector of the music industry? A panel organised in London this week by FastForward explored some of the issues.
The panel included Rachel Menzies, music supervisor at Native Music Supervision & Production; Alex Black, global director at EMI Production Music; Jon Eades, innovation manager at Abbey Road Red; and Patrick Stobbs, co-founder and COO at Jukedeck. The moderator was Chris Carey, CEO and founder of FastForward and Media Insight Consulting.
Talking points included the quality of AI-composed music; copyright ownership; how AI production capabilities differ from those of humans; the processes within the sync sector; and the challenges posed by AI to that area.
Black questioned AI’s ability to produce original creations in the traditional sense, citing AI’s dependence on ingesting vast amounts of music and generating new works largely from an existing body of material.
Stobbs suggested that this may not be so different from how humans spend countless hours studying and learning from existing work.
His view is that a software-based AI songwriting process makes it entirely possible to reveal what music an AI has been exposed to, and how that exposure has influenced the composition of an original piece. In Jukedeck’s view (unsurprisingly) this provides a strong argument for AI-composed music as original work.
Menzies brought up one of the more common ways AI music is currently used: to create large amounts of music very cheaply, for specific purposes (like online-video soundtracks) that might otherwise be served by more-expensive sync deals with labels.
Stobbs noted that, currently, the limitations of music created by AI make it most suitable for uses where creators just need something in the background, rather than uses where a higher level of ‘sonic recognition’ is required.
“Take the John Lewis Christmas ad, for instance. They want to hear a song that some people grew up with, remember dancing on their bed to. So that’s completely wrong for AI music. Any situation where AI music replaces perfectly good sources of music is pretty boring…and looking at the new opportunities is key,” he said.
Rather than replacing artists, Stobbs hopes that artificial intelligence in music creation – much as computers and software have become an influence for decades – will allow creators to focus more on the top levels of their composition, ultimately freeing time to focus on crafting “killer melodies, or lyrics, or other things”.
The panel reached some agreement that if AI is to have a significant impact on the music industry beyond these early uses, AI-produced works will need to stand on their own merit: qualitatively and acoustically competitive with works produced by human artists.
When considering whether it matters for sync that a song was written ‘by an artist or a robot’, Menzies stressed the high expectations that companies often have: principally, hopes for songs that are far beyond their actual budgets.
“People will pay half a million pounds for adverts that have certain music in them because the advertiser hopes the audience will buy into a certain type of nostalgia from the track. People pay for that familiarity, but not every single one of the tracks we use will be a familiar track,” she said.
Menzies went on to describe the specific requirements that a music supervisor will face when syncing a track, laying out a scenario from brief to completion.
“If we’re working on a brand commercial, they’ll come and say ‘our budget is x, we’re looking for a French gypsy jazz track that features an accordion and builds at certain points’ – or some briefs will just say ‘I want a track that’s blue’ – and dissecting briefs, budgets and available music, clients can really get their heart set on a track that’s out of budget,” she said.
“So then they might pay for a composed piece that sounds similar, and ultimately could go with a production piece. Having all of those three offerings available is so important to working with sync clients.”
Eades offered insight into the coming years and the likely influence of music created by artificially-intelligent systems, saying it is unlikely the industry will see a significant shift in the source of created music (from humans to artificial intelligence) for the majority of what is consumed.
Eades also raised the point that “it is a false assumption and a pervasive concept that artificial intelligence cannot break the rules, and cannot create unexpected value.”
Primarily hopeful, Eades and Stobbs both pointed to the broader impact of AI and computers across many industries, where the overall effect has been industry-wide re-skilling towards higher-skilled, more specialised roles.
Evaluating the competition that artificial intelligence might introduce to the industry in the coming years, Carey closed with a pointed observation: AI is going to affect different levels of songwriters in different ways.
“Hans Zimmer isn’t going to lose a score because there’s some AI out there now writing like Hans Zimmer, but for the person who’s working up to being the next Hans Zimmer, that’s where career redirection and repositioning becomes difficult,” he said.
These types of discussions will continue at FastForward’s upcoming events, the foremost being the company’s inaugural London conference on September 15th.