Abbey Road is the last album the Beatles recorded, and it is one of my personal favorites. By 1969 the members of the band no longer got along very well, mainly because they had reached maturity as artists and had artistic disagreements (along with some personal and business disagreements). Nevertheless, they knew that Abbey Road was going to be their last album, so they decided to enjoy it and go out with a bang. They were no longer heavily influencing each other artistically, and they did not have a constraining unifying theme for the album as was the case for Let It Be, so they were free to put out their best work. The result is an album in which each song (or, sometimes, pair of songs) is quite different from the rest. From an information theory point of view, each song is unexpected given the previous songs: each song is ‘surprising’ and hence provides a large amount of information. Contrast this with most albums, where the songs sound more or less the same: you can easily predict how the next song is going to sound even if you can't predict the specific melody.
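That intuition can be made precise with Shannon's notion of surprisal: the information content of an event is the negative log of its probability, so the harder the next song is to predict, the more information it carries. Here is a minimal sketch; the listener-model probabilities are made up purely for illustration.

```python
import math

def surprisal(probability: float) -> float:
    """Shannon information content in bits: I(x) = -log2(p(x))."""
    return -math.log2(probability)

# Hypothetical listener model: the probability of correctly guessing the
# style of the next track, given the album so far. Numbers are invented.
predictable_album = 0.8  # songs all sound alike: next track is easy to guess
surprising_album = 0.1   # each track is a stylistic surprise

print(surprisal(predictable_album))  # ~0.32 bits: little new information
print(surprisal(surprising_album))   # ~3.32 bits: an order of magnitude more
```

The point is only that low-probability ("surprising") events are exactly the high-information ones, which is why an album of very different songs tells you more per track.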
I was listening to Abbey Road a couple of months ago when I realized that, because the songs are so different, I could learn about my musical preferences by ranking the songs on Abbey Road. Then my mind wandered to whether these musical preferences could be related to personality, buying habits, etc., and whether an adaptive music algorithm could find the sets of songs from which you could learn the most about the subjects. Heck, at that point you could create ‘personalized albums’ to be delivered periodically à la Stitch Fix, mining social media to predict the songs that your customers would like but that would ‘surprise’ them (unlike Netflix, which only recommends movies you would have watched anyway). But all of that is a little too far into the future. In the meantime, is there any evidence that this might even work?
Welcome to the catch of this week: David Greenberg’s research at the University of Cambridge in the UK. His research sits at the intersection of musical behavior, personality, and cognitive science, and investigates what a person’s musical preferences can tell you about his or her cognitive style or personality. I am linking two recent articles related to this and a press release below. The link that I highly recommend is this one: http://www.musicaluniverse.org/. You can take a quiz about your music preferences by (individually) ranking 25 pieces of music. You can also take personality tests that will help Prof. Greenberg with his research, but you can skip them entirely and go directly to the musical styles part.
Lastly, I show my results. The maximum score for any given category seems to be 45. In general I agree with the vectors he has identified to describe musical styles, and I am pasting the definitions from the website.
- Mellow music is defined as romantic, relaxing, unaggressive, sad, slow, and quiet; it is often heard in the soft rock, R&B, and adult contemporary genres;
- Unpretentious music is defined as uncomplicated, relaxing, unaggressive, soft, and acoustic; it comes primarily from the country, folk, and singer/songwriter genres;
- Sophisticated music is defined as inspiring, intelligent, complex, and dynamic; it comes from the classical, operatic, avant-garde, world beat, and traditional jazz genres;
- Intense music is defined as distorted, loud, aggressive, and neither relaxing, romantic, nor inspiring; it comes from the classic rock, punk, heavy metal, and power pop genres;
- Contemporary music is defined as percussive, electric, and not sad; it comes from the rap, electronica, Latin, acid jazz, and Euro pop genres.
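Since each listener gets a score in each of these five dimensions, a natural way to think about the quiz output is as a preference vector, and a simple (hypothetical) way to compare two listeners is the cosine similarity between their vectors. The scores below are invented for illustration; this comparison is my own sketch, not a method from Greenberg's papers.

```python
import math

# The five style dimensions defined above, each scored out of
# (apparently) 45 on the quiz.
MUSIC_DIMENSIONS = ["Mellow", "Unpretentious", "Sophisticated",
                    "Intense", "Contemporary"]

def cosine_similarity(a, b):
    """Cosine of the angle between two preference vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Invented example profiles.
listener_1 = [30, 12, 40, 25, 18]
listener_2 = [28, 15, 38, 20, 22]  # similar tastes to listener_1
listener_3 = [10, 40, 12, 35, 8]   # quite different tastes

print(cosine_similarity(listener_1, listener_2))  # close to 1
print(cosine_similarity(listener_1, listener_3))  # noticeably lower
```

With a representation like this, "learning the most about a subject" could be framed as picking the songs whose rankings best separate candidate preference vectors, which is the adaptive-album idea above.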
Musical tastes offer a window into how you think (Press Release):
Personality predicts musical sophistication: http://www.sciencedirect.com/science/article/pii/S0092656615000513
Musical Preferences are Linked to Cognitive Styles: http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0131151