iTunes OCD

You’d think that having an iPod would be an endless parade of musical bliss. And, mostly, it is. But the one worm in the apple is that now you have 5,000 songs in the library, and you have to rate them.

“Oh, come now,” you say. “You don’t have to rate those songs.” And maybe you’d be right in a sort of narrow-minded, positivist, literal way. But when you’re talking about me, you’re talking about someone who eats all of the green M&M’s first, and is convinced that they actually taste better. The robots demand that I rate my music; therefore, I must rate it. I am a man without free will.

So the first problem you face is what the ratings actually “mean.” iTunes lets you rate songs anywhere from 0 to 5 stars. I quickly disposed of the “0” rating – with 5,000 songs, I needed some way to distinguish those that I haven’t rated at all. So a zero rating means “not rated yet.”

It only took me a few days to assign semantic meanings to each of the ratings:

The rating isn’t everything, of course. Ratings are focused on songs, and that doesn’t tell the whole story. For example, The Ex’s Joggers and Smoggers album is composed of bits of found sound and rhythm that, considered on their own, are probably 1 or 2 stars no matter who is “rating” them; but taken as a whole, the album is a masterwork – the songs need their neighbors. The same is true of the brief, nameless interstitial audio collages on The Loud Family’s Days for Days. But at least for your average pop song, it’s a workable system. I just took a look at the distribution of ratings, and it is pleasingly bell-shaped, with perhaps a slight bias towards low ratings.

The next problem is what to do with the ratings. Sure, I made the inevitable smart playlists and such, but fundamentally I’ve generated this mass of data, and I’m a geek, so: I want to data mine it. What music do I like? Off the top of my head, if you asked “Who are your favorite artists?” I’d say: Aimee Mann, Loud Family (a.k.a. Game Theory), Nick Cave, Tom Waits, and Shadowy Men on a Shadowy Planet. Does the view of my library I get from my individual song ratings match what I’d say I like if you asked me?

It turns out that iTunes doesn’t have any built-in way to do this easily. Lots of people have written scripts to do various things, but I was only able to find one script that came close to what I wanted to do: a script which ranked albums, rather than artists. So as a first cut, I decided to run it and see what it said.

| **Rank** | **Artist** | **Album** | **Rating** |
|---|---|---|---|
| 1 | David Bowie | Scary Monsters | 4.2 |
| 2 | Aimee Mann | Bachelor No 2 | 4.08 |
| 3 | Nick Cave And The Bad Seeds | Live Seeds | 4.0 |
| 4 | They Might Be Giants | Lincoln | 3.78 |
| 5 | Nick Cave And The Bad Seeds | Let Love In | 3.7 |

Hmmm, not too bad. It’s a deceptive view, though. The album ranking script is brittle. In particular, it will only rank albums for which you’ve rated every single song. Since I know for a fact I haven’t even come close to that yet, I know it’s missing a lot of data (and by the very nature of rating psychology, there’s probably a bias to rate songs that you like before rating songs that you don’t like). Furthermore, the “album” is the wrong unit of measure for this sort of averaging. Going back to my Loud Family example: Days for Days has 10 superb songs that I rated highly and 10 little interstitial pieces that I gave low ratings because they don’t really stand on their own. This kills the curve. The end result is I have a script telling me that I think, subconsciously, that The Proclaimers' Sunshine on Leith is marginally better than Days for Days, which isn’t true at all.
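To make that brittleness concrete, here’s a rough sketch of the logic as I understand it from the script’s behavior – in Python rather than the original AppleScript, and with a made-up `(artist, album, stars)` track structure purely for illustration:

```python
from collections import defaultdict

def rank_albums(tracks):
    """Rank albums by mean star rating, skipping any album with an unrated track.

    `tracks` is an iterable of (artist, album, stars) tuples, stars 0-5,
    where 0 means "not rated yet".
    """
    by_album = defaultdict(list)
    for artist, album, stars in tracks:
        by_album[(artist, album)].append(stars)

    ranked = []
    for (artist, album), stars in by_album.items():
        if any(s == 0 for s in stars):   # the brittle part: one unrated song...
            continue                     # ...and the whole album drops out
        ranked.append((sum(stars) / len(stars), artist, album))

    return sorted(ranked, reverse=True)
```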

So, I decided to take the script and see if I could use it as a base to generate some more interesting views of the data. This meant I had to play with AppleScript. Have you ever used AppleScript? It’s like Cobol, but less versatile. Oh my God, what an ugly language. Basically, your script interacts with iTunes by sending it commands, asking it to hand over the ratings for all the songs by such-and-such an artist. Really, if you want to do something like this, I think a better way would be to just parse the iTunes Library XML database directly, rather than using AppleScript. But, what can I say, I was lazy.
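For the record, the XML route looks roughly like this. This is a sketch, not something I actually ran: it assumes the library file is in its usual location and uses the standard “Tracks”, “Artist”, “Album”, and “Rating” plist keys (iTunes stores ratings as 0–100, 20 points per star):

```python
import plistlib
from pathlib import Path

# The path varies by iTunes version; adjust to wherever your library XML lives.
LIBRARY_XML = Path.home() / "Music" / "iTunes" / "iTunes Music Library.xml"

def load_tracks(path=LIBRARY_XML):
    """Yield (artist, album, stars) for every track in the library XML.

    A missing "Rating" key means the track hasn't been rated, which maps to
    0 stars here.
    """
    with open(path, "rb") as f:
        library = plistlib.load(f)
    for track in library["Tracks"].values():
        yield (
            track.get("Artist", "Unknown"),
            track.get("Album", "Unknown"),
            track.get("Rating", 0) // 20,   # convert 0-100 to 0-5 stars
        )
```

Feeding `load_tracks()` into the `rank_albums` sketch above would reproduce, more or less, what the album-ranking script does – minus the AppleScript.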

So with just a little tweaking, I got a list of artists and rankings, rather than albums. The invariants were a bit different: you didn’t have to have rated every song by an artist, but only rated songs counted towards an artist’s score, and artists with fewer than 10 rated songs weren’t included at all (a Python sketch of the same calculation follows the table):

| **Rank** | **Artist** | **Rating** |
|---|---|---|
| 1 | Nick Cave And The Bad Seeds | 3.85 |
| 2 | Aimee Mann | 3.81 |
| 3 | DJ Z-Trip & DJ P | 3.8 |
| 4 | Kate Bush | 3.64 |
| 5 | Me First And The Gimme Gimmes | 3.42 |
| 6 | Nick Cave | 3.35 |
| 7 | Paola & Chiara | 3.33 |
| 8 | Michiru Oshima | 3.29 |
| 9 | Billy Bragg | 3.29 |
| 10 | Jane Siberry | 3.26 |
| 11 | Richard Thompson | 3.25 |
| 12 | Tom Waits | 3.22 |
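Here’s the promised sketch of that calculation, again in Python rather than what I actually ran, reusing the hypothetical `(artist, album, stars)` tuples from above:

```python
from collections import defaultdict

def rank_artists_by_mean(tracks, min_rated=10):
    """Mean star rating per artist, counting only rated songs (stars > 0) and
    dropping artists with fewer than `min_rated` rated songs."""
    rated = defaultdict(list)
    for artist, _album, stars in tracks:
        if stars > 0:
            rated[artist].append(stars)

    ranked = [
        (sum(stars) / len(stars), artist)
        for artist, stars in rated.items()
        if len(stars) >= min_rated
    ]
    return sorted(ranked, reverse=True)
```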

Better, but still some odd results. Fundamentally, I converged on the idea that taking the mean of the star ratings is broken. The stars have semantic meanings that don’t really map smoothly onto a continuous 0–5 scale. So I gave it one more try.

This time, I decided that what I really wanted to know was: which artists have the strongest repertoires (subjectively) in my library? So I decided to use the idea of “hot songs.” If I gave a song a 4 or 5 rating then – by my personal scale, described above – it was a song that I’d be glad to listen to nearly any time. Whether it’s a 4 or a 5 doesn’t matter while I’m listening to it. If a song is below that threshold, it doesn’t really matter whether it’s a 1, 2, or 3. Back to the AppleScript editor once again (oh, the pain, the unending pain) to generate a list of artists. This time, the score is an integer between 0 and 100: roughly, the number of “hot” songs by that artist in the library, divided by the total number of songs by that artist in the library, times 100. So an artist for whom I rated every song a 4 or 5 would get a score of 100, and an artist with no 4s or 5s would get a 0. I also decided to display the number of songs feeding the calculation, rather than just the score, which I find a bit more interesting (a Python sketch of this one follows the table, too):

| **Rank** | **Artist** | **Score** | **# Hot Songs** | **# Songs** |
|---|---|---|---|---|
| 1 | Aimee Mann | 59 | 22 | 37 |
| 2 | Bob Mould | 55 | 6 | 11 |
| 3 | Billy Bragg | 52 | 11 | 21 |
| 4 | Nick Cave | 45 | 14 | 31 |
| 5 | Nick Cave And The Bad Seeds | 44 | 18 | 41 |
| 6 | Lou Reed/John Cale | 40 | 6 | 15 |
| 7 | Indigo Girls | 40 | 4 | 10 |
| 8 | DJ Z-Trip & DJ P | 39 | 9 | 23 |
| 9 | Pixies | 38 | 36 | 94 |
| 10 | Game Theory | 37 | 10 | 27 |
| 11 | Tom Waits | 36 | 49 | 135 |
| 12 | Karl Hendricks Trio | 34 | 23 | 68 |
| 13 | Talking Heads | 33 | 5 | 15 |
| 14 | The Ex + Tom Cora | 33 | 4 | 12 |
| 15 | Me First And The Gimme Gimmes | 33 | 4 | 12 |
| 16 | Mary’s Danish | 33 | 4 | 12 |
| 17 | Los Straitjackets | 33 | 4 | 12 |
| 18 | Paola & Chiara | 31 | 8 | 26 |
| 19 | Wall Of Voodoo | 30 | 3 | 10 |
| 20 | The Sisters of Mercy | 30 | 3 | 10 |
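And the promised sketch of the hot-song score, same caveats as before (Python instead of AppleScript, hypothetical `(artist, album, stars)` tuples, and ignoring whatever minimum-song cutoff the real script applies):

```python
from collections import Counter

def rank_artists_by_hot_songs(tracks):
    """Score = round(100 * hot / total), where "hot" means a 4- or 5-star
    rating and total counts every song by the artist, rated or not."""
    hot, total = Counter(), Counter()
    for artist, _album, stars in tracks:
        total[artist] += 1
        if stars >= 4:
            hot[artist] += 1

    ranked = [
        (round(100 * hot[artist] / total[artist]),
         hot[artist], total[artist], artist)
        for artist in total
    ]
    return sorted(ranked, reverse=True)
```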

Now we’re starting to approximate my subjective worldview much better. Kate Bush drops completely out of the top 20 (thank God, my hipster status is secure, unless people notice that Paola & Chiara are there too). Furthermore, it’s immediately apparent which of the top 20 are outliers, just from the low number of tracks in play. I have one Indigo Girls album; it’s the one that is universally known as “the good one,” and it has 4 songs on it that I really like. That’s enough to give them a probably unjustly high position in the list. So this version of the script is unfairly generous to artists who only have a few tracks in the library. Probably the next step is to give a bonus to artists who have many tracks in the library, and a penalty to artists who only have a few. (David Bowie, for example, has pretty much the same score as Roxy Music, even though Roxy Music have only 2 highly rated songs in my library and he has 30. That’s unjust. Unjust, I tell you!)
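One way to do that bonus/penalty – just a sketch of one option, not something I’ve actually wired up – is to shrink each artist’s hot-song fraction toward the library-wide fraction, Bayesian-average style, so artists with only a handful of tracks hover near the global average while artists with deep catalogues earn (or lose) their own score. The `pseudo_count` knob below is entirely arbitrary:

```python
from collections import Counter

def rank_artists_shrunk(tracks, pseudo_count=15):
    """Hot-song fraction per artist, shrunk toward the library-wide fraction.

    The bigger `pseudo_count` is, the more tracks an artist needs before the
    score moves away from the global average.
    """
    hot, total = Counter(), Counter()
    for artist, _album, stars in tracks:
        total[artist] += 1
        if stars >= 4:
            hot[artist] += 1

    global_fraction = sum(hot.values()) / sum(total.values())

    ranked = [
        (round(100 * (hot[artist] + pseudo_count * global_fraction)
               / (total[artist] + pseudo_count)),
         artist)
        for artist in total
    ]
    return sorted(ranked, reverse=True)
```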

And, of course, I’ll need to continue with the arduous, backbreaking work of listening to all my music. And rating it.

The original AppleScript that I used as the pattern for all of this stat generation can be found [here](http://www.malcolmadams.com/itunes/scripts/scripts01.php?page=1#albumranking). If you’ve got a script that you like that does something similar, please mention it in the comments below. I can’t possibly be the only person in the world that is obsessive-compulsive about this.