Amper Music, the AI composition company, has been looking at how humans respond to AI-generated music and whether they can tell if it's been made by computers or humans (and whether, ultimately, that even matters). Working with audio research firm Veritonic, it had a number of panelists listen to music created by AI as well as stock music across a range of videos to see if they could hear the difference.
“Users could tell no discernible difference in either, signalling a major shift for the future of music-making and content creation,” the company says. “Further, over 1/4 of panelists would have a more positive brand perception if they knew the music in an ad was composed by AI.” It added that panelists showed no strong preference between the stock and AI music they heard side by side.
Of course, Amper has a horse in this race, but it is interesting that it has commissioned a study looking into the public's perception of AI-generated music. This is also arguably part of a rolling movement to change how AI music is understood by consumers at large. What is critical here is that the AI music was tested alongside stock music; perhaps this shows that, in the context of background music, people simply don't pay much attention to what is playing behind the visuals they are watching.
There has been a lot of hand-wringing about AI music and whether “the machines” are going to be the pop stars of tomorrow; but perhaps what this research really tells us is that stock music, rather than the Top 20, is the natural home for AI music.