XR, Voice-as-UI, Artificial Intelligence: in StartupLand, buzzwords abound as attention-grabbing shorthand. In the real world, however, a little more caution is warranted before being lured in by the promise a buzzword implies: sure, your app uses Quantum Machine Learning, but what does that actually mean, and what difference does it make?
Of those buzzwords, Artificial Intelligence is one that has become all-pervasive: it’s a label that’s been appended to a lot of existing services as an indicator of smartness.
AI that works well does a terrific job – particularly when it’s doing things that humans are slow or fallible at, such as organising and tagging music. At its core, German startup Cyanite offers an AI music-analysis service similar to those of some previous Startup Files subjects: its bespoke AI scans songs, identifies moods, extracts song data, and tags songs so that humans can navigate large sets of them.
MyPart does a great job at identifying emotions and patterns across huge swathes of music and lyrics, and Musiio has been analysing and tagging catalogues of songs for a number of years.
Cyanite sees more nuanced and innovative implementations of this AI technology on the horizon, and has been working with clients like Universal Production Music and BPM Supreme to identify these opportunities – from helping DJs pick better songs more quickly, to recommending TV shows based on the music they contain.

What is Cyanite, and what is it trying to do?
Cyanite has been operating between its twin headquarters of Berlin and Mannheim since 2017, and we spoke to two of its co-founders, Markus Schwarzer (CEO) and Jakob Höflich (CMO) about what their technology can do now, what it can do next – and how, in an era of huge catalogue acquisitions, they want to act as the Google Translate of music tagging.
Höflich says that in the simplest sense, Cyanite wants to give music companies opportunities “using state-of-the-art tech that they can’t develop themselves.” Cyanite has created AI-driven music analysis with a focus on B2B music, because they think that 99% of music online “is not found or monetised and we want to plug in there and find music for every case where it could be used. The music intelligence we are developing is like a translator between music and music business professionals, and the people who work with music in games, ads, podcasts and so on.”
Schwarzer explains how they want to provide tech that works regardless of a B2B user’s starting point. He picks a sync example: “a sound branding agency searches for music to different standards than a film agency, for instance – they need to have brand fit, they need numbers to convince people. That differs greatly from the film market, which searches for very abstract or visual keywords, or needs to swap out the score in weeks when they sell content to third-party territories.”
And then that differs again, he explains, from the UGC market on YouTube for instance. “So we are building one universal engine that solves this.”
There are a lot of AI music-analysis companies on the market – so what does Cyanite do, and what makes it different?
Höflich explains that the platform has three main uses, and the first, music tagging, is the most familiar.
The second is Cyanite’s similarity search and recommendation algorithm. British library music company Cinephonix uses its similarity search on their website, for instance, and it’s used by publishers for internal music searches. It’s also used in more novel ways: German radio station SWR used it to pioneer live music radio where listeners can shuffle in a new song to replace one they don’t like. Meanwhile, Virgin Music Label & Artist Service has used it in their social media campaigns to create custom audiences for emerging artists without existing fanbases.
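Cyanite hasn’t published the internals of its similarity engine, but the underlying technique is well established in music retrieval: reduce each track to a fixed-length vector of analysis scores, then rank the catalogue by distance to a query track. A minimal sketch of that idea, with invented song names and mood vectors:

```python
# Minimal sketch of embedding-based similarity search. Assumes each song has
# already been reduced to a fixed-length "mood vector" by an analysis model.
# Song names and vectors are invented; this is not Cyanite's actual pipeline.
import numpy as np

catalogue = {
    "song_a": np.array([0.9, 0.1, 0.3]),   # e.g. (energy, sadness, warmth)
    "song_b": np.array([0.2, 0.8, 0.5]),
    "song_c": np.array([0.85, 0.15, 0.4]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def most_similar(query_id: str, k: int = 2) -> list[tuple[str, float]]:
    """Rank every other song in the catalogue by similarity to the query."""
    query = catalogue[query_id]
    scores = [
        (song_id, cosine_similarity(query, vec))
        for song_id, vec in catalogue.items()
        if song_id != query_id
    ]
    return sorted(scores, key=lambda s: s[1], reverse=True)[:k]

print(most_similar("song_a"))  # song_c ranks first: the closest mood profile
```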
The third, Höflich says, is their “translation” technology, which can unite very different sets of music data: “it’s what we call keyword cleaning. We take huge chunks of metadata from libraries and publishers and clean it to remove redundant and false tagging.”
Schwarzer says that they uncover a lot of tagging mistakes. “With Universal Production Music (UPM), we identified 16,000 tagging mistakes. We literally cleaned up their mistakes – the wrong tags applied to the music. Their tags and keywords were good, but mistakes can be made by the person tagging on the day.”
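One common way to surface tagging mistakes like these is to compare the human-applied tags against a model’s own predictions and flag any disagreement for human review. The sketch below uses a hypothetical predict_tags() stand-in and invented tag data – an illustration of the idea, not Cyanite’s actual pipeline:

```python
# Illustrative "keyword cleaning": flag human-applied tags that an AI
# classifier disagrees with, so a human can review them. predict_tags()
# and the tag data below are hypothetical placeholders.
def predict_tags(song_id: str) -> set[str]:
    # Stand-in for a real AI classifier's output.
    return {"uplifting", "acoustic"} if song_id == "track_001" else {"dark"}

library_tags = {
    "track_001": {"uplifting", "acoustic", "aggressive"},  # "aggressive" looks wrong
    "track_002": {"dark"},
}

def flag_suspect_tags(library: dict[str, set[str]]) -> dict[str, set[str]]:
    """Return, per song, the human tags the model does not agree with."""
    suspects = {}
    for song_id, human_tags in library.items():
        disagreement = human_tags - predict_tags(song_id)
        if disagreement:
            suspects[song_id] = disagreement
    return suspects

print(flag_suspect_tags(library_tags))  # {'track_001': {'aggressive'}}
```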
Schwarzer explains that tagging precision is what they work hardest at, and, he says, is winning over clients: “BPM Supreme – and three very large enterprise companies that I can’t name – benchmarked us against the opposition and went with us.” The difference, he says, comes from the initial data generation that trained their AI: “we asked thousands of people how they felt after being exposed to music – and this real emotional data makes our mood classification precise.”
He thinks that means they understand better how their prospective B2B customers approach tagging. “You can ask four different music professionals what they think ‘uplifting’ music is, and you get four different ideas. So our tech understands what that user’s idea of ‘uplifting’ is and then applies it.” The result, he claims, is more accurate and cheaper AI learning.

The love of lexicon
This precision is not merely useful, Höflich says – the way a customer interprets music strongly shapes the lexicon of tags one company chooses to use compared to another.
“We can’t just shove our own AI tagging system down their throats,” Schwarzer says. “Customers said to us, ‘we have our own tagging and we don’t want to change it.’” Thus Cyanite works in its clients’ own language and interprets every company’s tagging vocabulary (UPM, for instance, has 1,400 unique tags) – and can then re-tag songs using that unique lexicon.
More interestingly, it can then apply the language of another company – or join together two different catalogues with accuracy. “With UPM,” he says, “we can translate their entire tagging to any other tagging taxonomy that they give us – without any sort of information loss.”
This means a third party could search the UPM library using their own proprietary tagging taxonomy, and not the internal UPM language – and still get the kind of results they’re familiar with.
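One plausible way to picture this “translation” is to map every external tag onto a shared set of internal concepts and then pick the target-taxonomy tags whose concepts overlap best. The vocabularies below are invented for illustration; Cyanite’s real mapping is proprietary:

```python
# Hypothetical tag translation between two taxonomies via a shared internal
# concept space, rather than a brittle one-to-one lookup. Both vocabularies
# are invented for illustration.
LIBRARY_A = {"feel-good": {"positive", "energetic"}, "gloomy": {"negative", "slow"}}
LIBRARY_B = {"upbeat": {"positive", "energetic"}, "melancholic": {"negative", "slow"}}

def translate(tag: str, source: dict, target: dict) -> list[str]:
    """Return the target-taxonomy tags that best cover the source tag's concepts."""
    concepts = source[tag]
    ranked = sorted(target.items(), key=lambda item: len(concepts & item[1]), reverse=True)
    best = len(concepts & ranked[0][1])
    return [t for t, c in ranked if len(concepts & c) == best and best > 0]

print(translate("feel-good", LIBRARY_A, LIBRARY_B))  # ['upbeat']
```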
Schwarzer points out how this cross-catalogue searching is extra important in the current era of expensive, wholesale catalogue acquisition by Hipgnosis et al – whose business model depends on wringing every last cent from their purchases. “Hipgnosis are buying a huge set of data and tags, and huge music intelligence along with the songs – they can do so much more with it. We could translate all of those different tags from the different catalogues into their own taxonomy.”
Just like SyncFloor – profiled previously in Startup Files – discovered, it’s vital to allow people working on the front line of monetising music to find it in the language they are accustomed to.
Schwarzer continues: “It’s like a translator between different languages: at Hipgnosis they speak in Hipgnosis music language, but people at Netflix speak Netflix music language. Making these people understand each other is very important in order to get a mutual understanding of music.”
How is Cyanite working with the industry already – and how would they like to in the future?
This year, Cyanite has started converting the AI interpretations of music into brand values and more abstract parameters, Schwarzer explains. “A music-describing parameter might be ‘uplifting and happy’ – but the brand values would be ‘mysterious, elegant, optimistic, positive, homely,’ etc. You can ask people on the street what a ‘homely’ song is and you get one hundred different answers. But those music professionals have a specific idea of a ‘homely’ song.”
Schwarzer says this brand-tagging has already worked for Amp Sound Branding and Universal Music Solutions in Germany, and that they are also working on projects with Dutch agencies Create Music and Tambr, and Antfood in the USA.
Cyanite aims to help agencies back up their song selections with data, and Schwarzer says the benefits of this are calculable: “you can cut out the music decision process. If you pay $50,000 for a Janis Joplin song, everyone wants to be involved – but if you can find a song that is $20,000 cheaper and has a better data score… we find music that helps them be more on-brand.”
For BPM Supreme, the digital record pool platform used by major DJs, the challenge was different: the company wanted to help DJs find new and appropriate music faster and more intuitively, “to spend less time searching and more time being creative,” as Höflich puts it.
So Cyanite tags BPM Supreme’s ever-growing catalogue with tagging designed for DJs. “We got a lot of feedback from the community and they love it. People are so used to searching Spotify by moods, and at an independent company like BPM Supreme it’s hard to offer this. It’s very tough to have consistency and objectivity and would be expensive in terms of time and money. They have 160,000 songs. A professional tagger can do 40k songs a year – that would be four years’ salary. They could create their own AI and that would cost four times that again! Or they could go with us and pay a fraction.”
What’s next for Cyanite?
Höflich is very excited by a proof-of-concept music recommendation project they are doing with one of the world’s major entertainment companies (he’s also under a watertight NDA). “For this company, music is one of the most important factors in what they make. So we are analysing both their music and their videos, and we work that into their recommendation system.”
Cyanite’s technology analyses what a specific user of a video-on-demand service watches, and then recommends other video content based on the mood and usage of music in videos they have previously watched – combined with other factors like the time of day. “Music is so connected to every entertainment industry… it could benefit the recommendation engines of all the creative industries,” Höflich says.
Höflich says they are also working on open text search, where users describe a film scene – he suggests “a couple in love are driving on the highway towards the sunset and a deer jumps out in front of them” – and receive music recommendations in return.
“We want to pick up on the semantic meaning,” Schwarzer says, “not just the sum of the keywords.”
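That is semantic search in a nutshell: embed the free-text scene description and each song’s description into the same vector space and rank by similarity, rather than counting shared keywords. A toy sketch, with a stand-in embed() in place of a real text-embedding model:

```python
# Toy semantic search: score songs against a free-text scene description in a
# shared vector space. embed() is a placeholder; a real system would use a
# trained text-embedding model rather than this bag-of-concepts lookup.
import numpy as np

def embed(text: str) -> np.ndarray:
    concepts = ["romantic", "driving", "sunset", "tense"]
    return np.array([1.0 if c in text.lower() else 0.0 for c in concepts])

songs = {
    "warm_highway_theme": "romantic driving music, sunset glow",
    "impact_cue": "tense sudden stinger",
}

def search(scene: str) -> list[tuple[str, float]]:
    """Rank songs by cosine similarity between scene and song descriptions."""
    q = embed(scene)
    def score(desc: str) -> float:
        v = embed(desc)
        denom = np.linalg.norm(q) * np.linalg.norm(v)
        return float(q @ v / denom) if denom else 0.0
    return sorted(((s, score(d)) for s, d in songs.items()), key=lambda x: x[1], reverse=True)

print(search("A couple in love driving on the highway towards the sunset"))
```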
Cyanite’s business model
Cyanite has a simple SaaS model: there’s a subscription based on a user’s catalogue size which provides AI tagging as a basic service, and then services like similarity search and tagging translation are add-ons. The price is very dependent on catalogue size and use case, Schwarzer says, so people should contact him directly at markus@cyanite.ai.
Startup: Cyanite
Category: Music-Tech / Artificial Intelligence
Headquarters: Mannheim & Berlin
Management Team: Markus Schwarzer (CEO), Joshua Weikert (CTO), Roman Gebhardt (CAIO), Jakob Höflich (CMO)
Funding so far: 950,000 USD
Cyanite is currently seeking to develop relationships with: Production Music Libraries, Music Publishing, DSPs, Music Software, DJ and Beat Platforms, Database Suppliers
Contact details: markus@cyanite.ai