Major Is Happy, Minor Is Sad?
If you’ve ever heard anything about chords and harmony, you’ll no doubt have heard that “major is happy” and “minor is sad” – it’s a nice succinct way to explain the concept of harmony to those who have no foundational understanding of it.
But we all know that human beings have many more than just two emotions. Music, like all art, is often a reflection of our inner lives. Over the thousands of years of human history, we have developed a vast universe of language, color, and music to express the complexity of the human condition in spite of our limited circumstances.
Among the tool sets that allow us to describe emotion in music is a never-ending Pandora’s box known as harmony. After listening to over two centuries of music (tens of millions of tracks), our AI has developed a logical system for categorizing music by “Genre”, “Mood”, and so on, based on its analysis of the mathematical properties of an audio file.
Breaking The Black Box
But like much of AI in general, exactly “how” it derives these conclusions is not something that we can claim to know fully. That being said, if we ask it questions of a limited sort, we may be able to discern its thought process, in the same way that a clinical psychologist might ask you probing questions to determine how your behavior today has been shaped by the experiences of your past.
Today, we are looking at one chord sequence played in various levels of harmonic complexity – Triads, Sevenths, and Ninths. The example I created and analyzed is pretty “unmusical”, with little to no stylistic/rhythmic information for the AI to go off of.
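To make the three levels of complexity concrete, here is a minimal sketch (not the actual excerpt I uploaded, just an illustration in an assumed key of C major) of how triads, sevenths, and ninths are built by stacking diatonic thirds over an ascending scale:

```python
# Illustrative only: stack diatonic thirds over a C major scale
# to build the three chord families discussed in this article.
C_MAJOR = [60, 62, 64, 65, 67, 69, 71]  # MIDI notes C4..B4

def diatonic_chord(degree, size):
    """Stack `size` diatonic thirds starting on scale `degree` (0-based)."""
    return [C_MAJOR[(degree + 2 * i) % 7] + 12 * ((degree + 2 * i) // 7)
            for i in range(size)]

triads   = [diatonic_chord(d, 3) for d in range(7)]  # e.g. C-E-G
sevenths = [diatonic_chord(d, 4) for d in range(7)]  # adds the 7th (C-E-G-B)
ninths   = [diatonic_chord(d, 5) for d in range(7)]  # adds the 9th (C-E-G-B-D)
```

Playing each list in ascending order gives the kind of parallel block-chord sequence described here, with one extra stacked third per level of complexity.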
It is very simple, but no matter what kind of musical question we upload to it, it must return an answer. And simple questions can often be extremely difficult to answer – so let us explore the tags not in the sense of them being objectively correct or not (because more likely than not, the AI is “grasping at straws” here), but rather, as a window into the fascinating internal world of this particular artificial intelligence.
What is it trying to tell us?
Let’s dive into it! For the sake of brevity I will not be tackling the Energy tag, which is the same in all 3.
Jazz and Classical are the most probable primary genres. This is not too surprising, if you think about it. A close-miked solo piano without vocals, playing block chords – what’s the likelihood that it could be anything else?
Granted, our Genre taxonomy does not cover every single genre from every single culture in the world, but we’ve covered most Western music genres and out of those, it really could only be Jazz or Classical. Note the confidence scores (72% and 52% respectively), they’ll become interesting later.
100% Sad. At first glance, this seems wrong. But let’s think about it. The tempo is slow, the rhythm is uncluttered, the key is major, but the writing itself is not exuberant. You could probably put sad lyrics over it and they would fit just fine.
It could not be “Happy”: if the AI assigned that description to music this subdued, the word would become less meaningful when applied to music that is “clearly” Happy.
Here we see Classical disappear entirely as a Primary Genre. That makes sense, because a sequence of Seventh chords in this ascending parallel manner is exceedingly rare in actual Classical music.
The Jazz score has risen by 8 points to 80% – a logical result, since seventh chords form the core of most Jazz music, though it is common practice to enrich the chords even further in arrangement and improvisation.
88% Sad and 40% Romantic. This makes sense – Triads are bare, honest, and straightforward in their emotional expression. Sevenths on the other hand are a tiny bit emotionally ambiguous, and may have “softened” the Sad-ness slightly in this case.
The Romantic tag appearing is interesting. Is it more “Romantic” than triads? In the context of modern recorded music, if you think about Smooth Jazz, RnB, Soul, or even subsets of Pop and similar genres that might be classed as Romantic – they often have lots of seventh chords. Perhaps it is this that the AI is hearing.
97% Jazz, a 17-point increase from the previous example. The addition of several layers of harmonic complexity to this solo piano excerpt places this solidly within the Jazz genre – it would be exceedingly unlikely to find a piece with all these qualities that is not also jazz.
The Ninths excerpt is Sad and Romantic, like the Sevenths excerpt, but in different ratios. We can see that the Sad score has jumped back up to 100%, and Romantic has fallen to 30%. It is clearly more emotionally nuanced than the Triad excerpt. I’ve thought about this for a long time and couldn’t come up with an explanation for why its scores differ from those of the Sevenths excerpt – what do you think?
The strongest correlation that emerged in this exploration was this: more harmonic complexity = more jazz. If you’re a musician, you probably already know this, but it’s interesting to see that the AI learned this from listening to 2 centuries of music, without being actively taught this fact.
The second interesting fact here was that harmonic complexity can add emotional complexity as well, which can no doubt be further influenced by instrumentation, tempo, melody, and so on. Again, this is something that we know – but our AI is able to express this mathematically.
Want to chat about AI? Get in touch with us here!
Musiio's resident Music Strategist and Music nerd. I ran an orchestra for 5 years, a virtual industry community of 7000 for 3 years, and currently run a small globally distributed creative audio team and compose commercially. I also like rocks and cats.