There are times when I'm on the road, showing the good folk of the music industry Musiio's Artificial Intelligence music tagging, live and in person.

I usually ask customers to pick the track, to guarantee a fair test, and I drop it into our web tagging demo to be tagged by our AI in real time.

Interestingly, people usually do one of two things: they either give me something straightforward to kick off, like Taylor Swift or Ed Sheeran, or they throw me a curve ball.

One of my favourite curve balls recently was in London, where I was asked to try Bring Me The Horizon's (BMTH) 2019 track, Nihilist Blues Ft. Grimes. Check out these results:

[Image: AI-generated tags for Nihilist Blues by Bring Me The Horizon featuring Grimes]

As the results popped up, I wasn't sure what I was seeing. I didn't know this specific track, but I did know BMTH are a heavy rock band, yet the genre came back as Electronic, and our AI was 100% confident. From here I went into investigation mode. Step one is always to listen to the track, and sure enough, the AI was correct!

https://www.youtube.com/watch?v=iwzfR7-33Wc

Since forming in 2004, BMTH had always been known as a 'heavier'-sounding Rock/Metal band, but their material released in 2019 sounds very different.

Our AI had outperformed me as a music industry professional. I'd assumed there should be a 'Rock' or 'Metal' genre tag in the results, but the AI selected the correct genre based on the audio.

At the time of this test, I checked many of the top music services globally and they all had Nihilist Blues categorised as Rock or Metal.

When music companies automate the tagging process, most will have BMTH already listed as a Rock artist and copy this data across all the band's tracks. This approach saves time, but it doesn't account for artists who develop their sound or switch genres, and genre-bending artists pose an even bigger challenge.
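
To make that concrete, here is a toy Python sketch of that metadata-propagation approach. The artist and track names are real, but the catalogue structure and genre values are illustrative assumptions, not any company's actual data model:

```python
# Naive artist-level propagation: every track inherits the artist's genre.
artist_genre = {"Bring Me The Horizon": "Metal"}

catalogue = [
    {"artist": "Bring Me The Horizon", "title": "Pray for Plagues"},  # 2006
    {"artist": "Bring Me The Horizon", "title": "Nihilist Blues"},    # 2019
]

for track in catalogue:
    track["genre"] = artist_genre[track["artist"]]

# Every BMTH track is now tagged "Metal" -- including Nihilist Blues,
# which actually sounds Electronic. The artist-level shortcut can't see
# that the band's sound changed.
```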

Our audio-based AI doesn't categorise music by what made an artist famous, or by the first thing they ever did. It tags music based on what it 'hears' in the MP3 or WAV file, using the features we extract in our processing. This helps us tag music with an accuracy of 90-99.9% per tag, in volumes of up to 1,000,000 tracks per day.
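
Musiio's actual pipeline is proprietary, so the following is only a minimal sketch of the general idea described above: summarise a track by features extracted from the audio itself, then classify genre from those features alone. The librosa/scikit-learn stack, the specific features, and the file names and labels are all assumptions made for illustration, not Musiio's system:

```python
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def extract_features(path):
    """Summarise a track as one fixed-length vector of audio features."""
    y, sr = librosa.load(path, sr=22050, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=20)        # timbre
    chroma = librosa.feature.chroma_stft(y=y, sr=sr)          # harmony
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)  # brightness
    tempo, _ = librosa.beat.beat_track(y=y, sr=sr)            # rhythm
    # Mean-pool each feature over time so every track, whatever its
    # length, maps to a vector of the same shape.
    return np.concatenate([
        mfcc.mean(axis=1),
        chroma.mean(axis=1),
        centroid.mean(axis=1),
        np.atleast_1d(tempo),
    ])

# Hypothetical training data: file paths plus human-verified genre labels.
train = [
    ("nihilist_blues.wav", "Electronic"),
    ("early_bmth_track.wav", "Metal"),
]
X = np.stack([extract_features(path) for path, _ in train])
labels = [genre for _, genre in train]

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X, labels)

# At prediction time the model only ever sees audio-derived numbers,
# so the artist's reputation cannot bias the genre it assigns.
probs = clf.predict_proba([extract_features("new_track.wav")])[0]
print(dict(zip(clf.classes_, probs)))
```

Production systems use far richer features and models, but the principle is the one sketched here: the prediction comes from the audio, not from the artist's history.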

Of course, you can overlay contextual data, such as chart positions and stream counts. But if the underlying assumption about the track is wrong, the contextual data won't be as valuable, if it's valuable at all.

Lots of the world’s best artists evolve their sound. That’s good for them, but not so good for catalogue management. 

Check out my next post for an exploration of other artists who have switched genres across their careers!

