Labels have been working hard to understand how smart assistants like Amazon’s Alexa decide what music to play when, for example, an Echo owner asks for some ‘happy indie music from the 1990s’.
These algorithms – and in some cases the metadata required to power them – are being created by Amazon’s engineering team. But imagine a scenario where Alexa starts to have her own opinions about culture and entertainment.
It’s on the cards. TechCrunch has an interesting interview with Amazon’s Fire TV vice president Marc Whitten, and while he’s talking about TV shows and films, his comments about Alexa’s future evolution can apply just as much to music.
“This is the 2018 version of the video buff at the video rental store,” he says. “This is the power of machine learning. One of the most interesting things we’re going after is how do you design an assistant that feels like you’re having a conversation with someone.”
The piece explains that Amazon’s longer-term goal is for Alexa to (in TechCrunch’s paraphrasing) “come up with more answers on her own starting with her own set of opinions that aren’t curated by an editorial team inside Amazon”, with Whitten describing Alexa operating without human curation as “the ambitious goal”.
It seems you can already ask Alexa what her favourite beer is and get an answer – Budweiser, if you’re wondering – that was not curated by a human.
What does this mean for music? A future where queries like ‘Who’s the best new artist?’ or ‘Play me something good’ generate responses without direct human curatorial involvement is a fascinating prospect – and one that extends to Google Assistant, Siri and other smart assistants through which we’ll be interacting with music.
Machines having their own opinions about music? Soon they’ll be arguing about bands in the pub and competitively bragging about how much better the early stuff was…
More seriously, understanding how this technology works and how people are interacting with it will be an even bigger priority for the music industry.