With 100,000 new tracks being uploaded to streaming services daily, there’s never been a greater need for expert music curation. AI music tagging can help curators make sense of new releases and build new playlists, but there’s nothing like the insights offered by music experts.
These experts can give context to music, and help listeners form a deeper connection with the material. They can turn background music into a listener’s favourite new track.
That’s the power of music radio.
But if an expert were to listen to 100,000 three-minute tracks, it would take over 200 days of non-stop listening to get through them all (100,000 × 3 minutes is 5,000 hours, roughly 208 days). Technology can support, filter and offer extra insights to expert curators, so they can focus on the portion of the music they actually want to hear rather than having to listen to everything.
To demonstrate the kinds of insights that are possible with AI, we analysed tracklists from three BBC Radio 6 Music shows to see what we could learn.
By examining Craig Charles’ Funk and Soul show, the Indie Forever show and The Morning After Mix, we’ve aimed to build a basic fingerprint of each programme. This can then be used as a north star to help guide each show’s sound. If the editorial talent behind each show is looking for a helping hand, these can be used to sense-check potential tracks, filter a large music catalogue, or add a dose of inspiration for a deeper dive into the musical archives.
Importantly, because each track gets tagged with over 20 pieces of metadata, it’s easier to avoid the scenario where an inappropriate track gets played, causing a listener to tune out. If integrated with music scheduling software such as RCS GSelector, this could help radio programmers increase audience retention.
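To make this concrete, here is a rough sketch of what a tagged track record might look like. The field names and values are illustrative assumptions, not Musiio's actual schema:

```python
# Hypothetical example of AI-generated track metadata.
# Field names are illustrative, not Musiio's real output format.
track = {
    "title": "Example Track",
    "artist": "Example Artist",
    "bpm": 112,
    "energy": "high",            # e.g. low / medium / high / very high
    "mood_valence": "positive",  # positive / neutral / negative
    "genres": ["Soul", "Funk"],
    "moods": ["Soulful", "Playful"],
    # ...plus further tags (key, vocal presence, era, etc.)
    # to make up the 20+ pieces of metadata per track
}
```

A record like this is what scheduling software would consume when deciding whether a track belongs in a given slot.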
Funk & Soul show
Air time: Saturday, 6pm
Craig Charles’ Funk and Soul show is much-loved by the 6 Music listenership and is as likely to feature sing-along classics as deep cuts and updated, re-energised DJ edits. Airing at 6pm on a Saturday night, it serves as a warm-up show for nights out up and down the country. And we can see this in the data.
Energies are either medium or high, and mood valence is 95% positive. This is a feel-good show.
As you’d expect, the most common genres are Soul, Funk and Early Soul, with the occasional Afrobeat and Reggae tag. Equally expected are Soulful and Playful moods being strongly represented.
Interestingly, the most popular tempo range for this show is 100-119 BPM, which is extremely danceable without being “ravey”; more a pre-club sound.
When assessing music for the show, the above criteria could be used to see how well each track fits into the sonic blueprint. Anything too far out of bounds would be at the discretion of the curator.
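As a sketch, the sonic blueprint described above could be expressed as a simple check against a track's tags. The field names and tag values here are assumptions for illustration, not Musiio's actual schema:

```python
def fits_funk_and_soul(track):
    """Rough sketch of a 'sonic blueprint' check for the Funk & Soul
    show, based on the tag patterns described above. Field names are
    illustrative, not a real tagging API."""
    return (
        track["energy"] in {"medium", "high"}          # no low-energy tracks
        and track["mood_valence"] == "positive"        # feel-good show
        and 100 <= track["bpm"] <= 119                 # danceable, pre-club tempo
        and bool(set(track["genres"])
                 & {"Soul", "Funk", "Early Soul", "Afrobeat", "Reggae"})
    )
```

Tracks that fail the check aren't rejected outright; they're simply flagged for the curator to make the final call.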
The Morning After Mix
Air time: Sunday, 4am
Made for people who need to settle down after a big night, The Morning After Mix has a completely different vibe.
For curators, there are a couple of easy filters that could be used to assess suitability for this show. First is energy: the majority of tracks have low energy. That could be the first filter when searching a catalogue.
Next is BPM. The majority of tracks are slower than 99 BPM. Mood valence is largely neutral, rather than uplifting or depressing. Beyond that, moods such as Melancholy, Peaceful and Dreamy all feature prominently.
The result is a playlist with an overwhelming number of folk tracks, the Folk genre tag accounting for 39 per cent of tracks.
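The coarse filters described above could be sketched as a simple catalogue search. Again, the field names are illustrative assumptions rather than a real tagging schema:

```python
def morning_after_candidates(catalogue):
    """Sketch: narrow a catalogue to Morning After Mix candidates using
    the three coarse filters described above (low energy, sub-100 BPM,
    neutral mood valence). Field names are hypothetical."""
    return [
        track for track in catalogue
        if track["energy"] == "low"
        and track["bpm"] <= 99
        and track["mood_valence"] == "neutral"
    ]
```

From the shortlist this produces, the curator can then dig into the finer-grained mood and genre tags.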
Indie Forever show
Air time: Saturday, 12am
The Indie Forever show on Radio 6 is a classic specialist show and is largely populated with indie rock tracks from the noughties. We can offer a few insights into what this show sounds like with the Musiio tagging AI.
The most popular tempo range is 120-139 BPM. Track energies are mostly high, with some medium and some very high.
Interestingly, the most common mood tag by a significant margin was Carefree. Less surprising was the mix of most common genre tags (Rock, Alternative Rock, Indie Rock, Indie).
How could this help radio show producers and programmers?
There are a few applications for this kind of data in a more traditional radio context. The first is to ensure that selected music fits the sound of a show. An even more interesting application is using this data to filter track submissions, or candidate tracks from the catalogue.
However, the most powerful application of this data could be when integrated into internet radio systems such as Radio.co.
When paired with detailed listener data, it could be possible to measure how many people tuned out while a specific track was playing. Programmers could then see which tracks work in which contexts, and use the track's tags to understand why at a musical level, all in service of maximising audience retention.
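As a sketch of this idea, a per-track tune-out rate could be computed from play events. The event format here is a hypothetical simplification, not Radio.co's actual API:

```python
from collections import defaultdict

def tune_out_rate(play_events):
    """Sketch: given (track_id, listener_tuned_out) play events, compute
    the share of plays during which a listener tuned out, per track.
    The event format is hypothetical, not a real internet-radio API."""
    plays = defaultdict(int)
    tune_outs = defaultdict(int)
    for track_id, tuned_out in play_events:
        plays[track_id] += 1
        if tuned_out:
            tune_outs[track_id] += 1
    return {track: tune_outs[track] / plays[track] for track in plays}
```

Tracks with unusually high tune-out rates could then be cross-referenced against their tags to see whether, say, tempo or mood valence was out of step with the show's blueprint.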
Got an application in mind and want to talk about AI with us? Drop us a message via our contact form, on Twitter, LinkedIn or email email@example.com.