British AI-music startup Jukedeck has been on the music industry’s radar for three years now, since TechCrunch’s August 2014 report on its “responsive music software” that could write music like a human would, for royalty-free use in online videos.
That December, Jukedeck won a startup prize at the LeWeb conference in Paris, although it would take another year of work before the official launch of its service in December 2015, accompanied by a £2.5m funding round.
Anyone can use Jukedeck’s tool to create music: choose a genre, mood, tempo, instruments and track length, then give the resulting track a name and – if it suits your soundtrack needs – pay as little as $0.99 for a royalty-free licence, or $199 to buy the copyright outright.
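Those creation options boil down to a small set of parameters plus a two-tier price list. As a purely illustrative sketch – the field names, structure and request flow below are assumptions for the sake of example, not Jukedeck’s actual API – a request to a Jukedeck-style service might look like this:

```python
# Illustrative only: a hypothetical request to a Jukedeck-style service.
# Field names and the quote() helper are invented for this sketch.
from dataclasses import dataclass, field

@dataclass
class TrackRequest:
    genre: str                  # e.g. "folk", "electronic"
    mood: str                   # e.g. "uplifting", "melancholic"
    tempo_bpm: int              # beats per minute
    instruments: list = field(default_factory=list)
    length_seconds: int = 30    # requested track length
    title: str = "Untitled"     # name given to the resulting track

# The two licence tiers mentioned in the article.
PRICES = {"royalty_free_licence": 0.99, "full_copyright": 199.00}

def quote(licence: str) -> float:
    """Return the price for the chosen licence tier."""
    return PRICES[licence]

req = TrackRequest("folk", "uplifting", 110,
                   instruments=["piano", "strings"],
                   length_seconds=30, title="Morning Walk")
print(quote("royalty_free_licence"))  # 0.99
```

The point is simply that the user’s entire creative input is a handful of high-level choices; everything below that level is left to the model.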
Music Ally picked Jukedeck as a startup to watch in 2016, and its CEO Ed Newton-Rex appeared at the AI music event we ran with UK industry body the BPI that November, claiming that “AI will change the music industry and lots of other industries a lot more than the internet did”.
That kind of view is being talked about a lot more in 2017, with startups including Amper Music, Popgun, AI Music, Groov.ai, Rave and AIVA as well as the efforts of Google (with its Magenta project) and Sony (with its Flow Machines).
We wondered how things were progressing with Jukedeck, which has been relatively quiet this year. Newton-Rex says that its team has been hard at work.
“We’ve basically been focused on research and building this technology. We’re working with neural networks, with machine learning, so what we spend most of our time doing is working on our machine-learning models, making them better,” he says.
“The thing about this field is that there’s a lot of legwork. A couple of years ago, AI wasn’t at the stage where it could write a piece of music good enough for anyone. Now it’s good enough for some use cases, like video creators. But it’s still far from perfect: we’ve still got a lot of work to do.”
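To make the “machine-learning models” concrete at toy scale: Jukedeck’s actual systems are neural networks, but the underlying idea – learn note-to-note patterns from existing music, then sample new sequences from those patterns – can be sketched with something as simple as a first-order Markov chain. The corpus and note names below are invented for illustration.

```python
# Toy illustration of statistical melody generation: a first-order
# Markov chain over note names. Real systems use neural networks;
# this only shows the learn-transitions-then-sample idea.
import random
from collections import defaultdict

def train(melodies):
    """Count which note tends to follow which in the training data."""
    transitions = defaultdict(list)
    for melody in melodies:
        for a, b in zip(melody, melody[1:]):
            transitions[a].append(b)
    return transitions

def generate(transitions, start, length, seed=0):
    """Sample a new melody by repeatedly picking a learned successor."""
    rng = random.Random(seed)
    melody = [start]
    for _ in range(length - 1):
        successors = transitions.get(melody[-1])
        if not successors:        # dead end: no learned continuation
            break
        melody.append(rng.choice(successors))
    return melody

corpus = [["C", "D", "E", "G", "E", "D", "C"],
          ["C", "E", "G", "E", "C"]]
model = train(corpus)
print(generate(model, "C", 8, seed=42))
```

Even this toy version shows why “legwork” dominates: the model only ever recombines patterns it has seen, so the quality ceiling is set entirely by the data and the sophistication of the model.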
Jukedeck will be making some product announcements “soon”, but Newton-Rex says that the wider discussion around AI’s implications for music is timely – not just for the music industry, but for the wider tech world.
“One of the reasons people are interested in this problem is that it gets to the heart of something that until recently people weren’t really thinking about in terms of AI, which is creativity,” he says.
“AI has been getting more and more useful in more fields for a while now. It wasn’t long ago that people were saying AI would never be able to drive a car. Now, once they see these systems working, they realise that it can. The same thing happened with chess and with Go.”
“People thought an AI beating the world champion at Go was 10 years away, then Google DeepMind did it [in two]. But creativity is still quite a new thing: even when we started, people didn’t necessarily believe that this was something that computers could do. But now they realise that this stuff is possible.”
Newton-Rex, like other AI-music execs, has to walk a fine line: making bold predictions about the long-term capabilities of this technology without alienating musicians and rightsholders who are worried about the impact these systems will have on working musicians.
He stresses that Jukedeck has a team full of musicians – many of them semi-professional – and pitches the company’s technology as having the potential to open up music, rather than disrupting human musicians out of their incomes.
“What really drives us is two main things. First, you can democratise music. As soon as AI understands a bit more about how to write music, you can put that power into a lot more people’s hands. People who aren’t classically-educated can play and tinker with music, which is really exciting,” says Newton-Rex.
“The other side is the personalisation aspect, in terms of consuming music. Recorded [human] music is brilliant and will never die out, and it won’t be replaced by AI. But once you have AI, you can really personalise the way music is consumed.”
“You can give every person in the world their own personal composer, and music can respond to anything from their environment to their mood or their calendar. It’s those twin goals of democratisation and personalisation that get us out of bed in the morning.”
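That “personal composer” idea amounts to mapping listener context – time of day, mood, activity, calendar – onto generation parameters. As a hedged sketch (the rules, parameter names and values below are invented for illustration, not any product’s actual logic):

```python
# Sketch of context-driven personalisation: listener context in,
# music-generation parameters out. All rules here are invented.
def personalise(hour: int, mood: str) -> dict:
    """Map time of day and mood to hypothetical generation settings."""
    params = {"genre": "ambient", "tempo_bpm": 80}  # calm default
    if mood == "energetic":
        params.update(genre="electronic", tempo_bpm=128)
    if hour >= 22 or hour < 6:                      # wind down at night
        params.update(genre="ambient", tempo_bpm=60)
    return params

print(personalise(23, "calm"))      # {'genre': 'ambient', 'tempo_bpm': 60}
print(personalise(14, "energetic")) # {'genre': 'electronic', 'tempo_bpm': 128}
```

Because generation is cheap and on-demand, each listener could get a different output from the same rules – which is exactly what a static recorded catalogue cannot do.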
The questions – and in some quarters real anger – from musicians and songwriters are nevertheless an important factor in this year’s debates about AI music. Some see it as an existential threat, while others disdain it as destined to fall far short of rivalling human creativity.
Newton-Rex frames these arguments in another way, comparing them to the reaction that greeted synthesizers in the late 1970s and early 1980s, when some parts of the music community protested loudly.
“It was ‘This isn’t real music, only acoustic instruments matter!’ And the thing is, there is a certain merit to that. There will always be people who don’t want to use AI like they don’t want to use electronic instruments,” he says.
“That’s a choice, which is brilliant. But it’s no reason to stop people who do want to use AI – which could be millions of people – from using it. Every technological change ruffles a few feathers.”
He adds that some musicians have actually started using Jukedeck’s tools for their own creative purposes, seeing AI as a new frontier rather than a threat.
“More and more artists in general are starting to look at what neural networks are capable of, and not just for music. They see this as culturally and artistically interesting, and want to know what they can do with these systems,” says Newton-Rex.
This may already be influencing Jukedeck’s product roadmap. “When it comes to actual musicians – people who want to make music – we have lots planned in that regard,” he says.
What about the arguments over whether AI can ever truly match human creativity and compositional skills? Newton-Rex has a blunt response to the oft-cited comparison of these systems with the most popular human musicians.
“It doesn’t need to be better than Adele or Ed Sheeran. There’s no desire for that, and what would that even mean? Music is so subjective. It’s a bit of a false competition: there is no agreed-upon measure of how ‘good’ a piece of music is,” he says.
“The aim [for AI music] is not ‘will this get better than X’ but it’s about whether this will be useful for people. Will it help them? People will use pure AI music if they want a track for a backing video. People will listen to human-composed music. And people will make and listen to a hybrid of the two.”
There may be a generational divide here for listeners. In Music Ally’s recent profile of Popgun, its CEO Stephen Phillips pointed out that tens of millions of children have grown up with generative music in games like Minecraft and Roblox, so may not share the instinctive resistance to AI music that some older listeners have.
Newton-Rex also hopes that one by-product of the various AI-music startups may be new insight into how our own creative processes work as humans.
“The most interesting thing about all this is that it might give us an insight into how the human composition process works. We don’t really know how composition works: it’s hard to define it, or even creativity in general,” he says.
“In building these systems, we start to ask questions like: how does this work in the human brain? That’s a really interesting question.”
When it comes to Jukedeck’s ‘brain’, it turns out that, much like humans, sometimes its best music comes from mistakes: when it does something it wasn’t supposed to, and produces an interesting melody or sound as a result.
“Miles Davis said that there are no wrong notes in jazz. You hit a wrong note, as he probably quite often did, repeat it a few times, and suddenly it’s a feature! Something you think is appalling the first time you hear it, you start to understand it and maybe even grow to like it,” says Newton-Rex.
“It’s really hard in music to truly objectively say what’s good and what’s bad. It’s really just about what you like, and what you like can be predicated on what you’ve heard before.”
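One common way generative systems get those “happy accidents” is sampling temperature: at low temperature a model almost always picks the most probable next note, while at higher temperature unlikely (“wrong”) notes slip through, some of which turn out to be interesting. The probabilities and note names below are invented for illustration; this is not Jukedeck’s actual mechanism.

```python
# Sketch of temperature sampling over next-note probabilities.
# Low temperature -> safe, predictable notes; high temperature ->
# occasional surprising ("wrong") notes, some of which become features.
import math
import random

def sample(probs, temperature, seed=0):
    """Sample a note after rescaling probabilities by temperature."""
    rng = random.Random(seed)
    # p ** (1/temperature), computed via exp/log for clarity
    scaled = {n: math.exp(math.log(p) / temperature) for n, p in probs.items()}
    r = rng.random() * sum(scaled.values())
    for note, weight in scaled.items():
        r -= weight
        if r <= 0:
            return note
    return note  # guard against floating-point rounding

next_note = {"E": 0.7, "F": 0.2, "F#": 0.1}  # F# is the "wrong" note
print(sample(next_note, temperature=0.5))  # almost always "E"
print(sample(next_note, temperature=2.0))  # surprises become more likely
```

Turning the temperature up is a crude but real analogue of Davis’s point: the system is deliberately allowed to hit notes its training data says are improbable.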
Newton-Rex also has some useful context to provide around Spotify’s recent poaching of AI-music expert François Pachet from Sony Computer Science Laboratories in Paris.
In 2016, Pachet’s team released two tracks that were created by a collaboration between AI and humans. ‘Daddy’s Car’ was a Beatles-y number, while ‘The Ballad of Mr Shadow’ was influenced by Irving Berlin, George Gershwin and other classic American songwriters.
News of his move to Spotify broke amid heavy coverage of the so-called ‘fake artists’ commissioned to provide tracks for the streaming service’s mood playlists – sleep, relaxation, focus etc.
It’s reasonable to wonder whether hiring Pachet is a sign that Spotify sees a potential role for AI in generating music for these kinds of playlists. The prospect of AI artists on Spotify is guaranteed to spook labels, publishers and musicians alike. But is this really a possibility?
“I have no idea about what Spotify is doing, or streaming services in general. But to the question ‘one day will a piece of software be able to compose music that knows you, and can put you to sleep?’ the answer is absolutely,” says Newton-Rex.
“That’s exactly the kind of field in which AI can really help and be useful. From music to help you sleep to lullabies for babies, things like that. And it could be personalised: how close are you to sleep, what kind of music do you like, and so on. There’s huge potential.”
Read Music Ally’s other recent AI music coverage:
Startup AI Music reveals its plans for ‘shape-changing’ songs
Playing next door to Alice: Popgun reveals its music AI
Google’s A.I. Duet makes an algorithm your piano partner