The music industry, its creators and its users are all voicing their thoughts on the opportunities and worries around generative AI technology. Three related stories today offer the latest snapshot of how those views are evolving.

First, the boss of British industry body UK Music, Jamie Njoku-Goodwin, has set out his views on what AI technologies mean for music. It’s a now-familiar blend of welcoming the potential of ‘assistive’ AI tech for musicians, while outlining concerns about deepfakes and unlicensed training.

There is a new phrase though. “Music laundering – a process where you could steal someone’s work, feed it into an AI, and then generate clean, ‘new’ music, just as a money laundering operation might do with stolen money,” wrote Njoku-Goodwin. “Particularly perversely, creators could be having their work stolen and used to train an AI without even being aware.”

“There is a complete lack of transparency around the ingestion process for AIs and without this, it will be difficult, if not impossible in some cases, to hold bad actors to account. That has to change. We need far greater transparency and detailed record keeping about that process as a first step towards working together to create a system which safeguards human creativity and human connection through music, while fostering rewarding innovation.”

The big challenge here, of course, is that good actors will be happy to be transparent, but those bad actors… won’t. Who, if anyone, will be qualified to not just enforce this record-keeping, but also to check that it’s accurate? An unanswered question, but the UK Music post is a useful summary of the industry’s current regulatory wishlist.

Elsewhere, The Verge has surveyed more than 2,000 US adults on their views about AI, finding that only a third of respondents have tried the new wave of AI tools (e.g. ChatGPT, Bing Chat, Snapchat’s My AI, Google’s Bard) – and that this use is “dominated” by millennials and Gen-Z.

Of people who’ve used image-generating AIs like Midjourney and Stable Diffusion, 44% have asked the AI to copy a [visual] artist’s style – yet 43% believe companies should ban copying artists, and 70% believe artists should be compensated when AI copies their work.

That will go down well with our industry if these views carry across into musical AIs, as will the finding that 76% of respondents think it should be illegal to create video or audio deepfakes that imitate a real person without that person’s consent.

Encouragingly, the music industry’s desire to protect and recompense its creators in the AI age may be in sync with the views of wider internet users – something that hasn’t always been the case with disruptive new technologies. That’s certainly a strong point when lobbying politicians and regulators.

Finally, popular production-focused site Bedroom Producers Blog has surveyed more than 1,500 music producers for their views on AI, providing a good snapshot of their concerns and excitement.

How do they feel about AI in music production? 34.8% of respondents said they feel positive and 17.3% feel negative, but the biggest slice – the 47.9% who feel neutral – suggests the jury is still out for nearly half of producers. Meanwhile, 36.8% of producers surveyed are already using AI tools in their work, with mixing and mastering tools the most popular category, followed by AI music generators.

The survey also tackles existential questions (only 16.3% think AI music generators will ever be able to fully replicate a human producer) and copyright questions (36.8% think AI-generated music should be public domain, 28.6% think the end user should own the copyright, and 22.7% think it should be shared between the user, the AI tool developer and the source-art creator).

