Semantic Music

I have a pretty extensive music collection, and for some time I’ve been looking for a way to make it respond to my requests more intelligently. Taking the idea of the semantic web over to music, every song should have some basic metadata, and generally, all my files are pretty well tagged: Artist, Title, Album, Disc, Track number, etc. It’s when I try to extend the metadata to more useful things that I run into trouble.
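
Reading those basic tags back out of a file is the easy part. Here’s a minimal sketch using the mutagen Python library (the library is my pick, and the file name is just a placeholder):

```python
# Minimal sketch: read the standard ID3 tags with mutagen (library choice is mine).
from mutagen.easyid3 import EasyID3

audio = EasyID3("song.mp3")  # placeholder path
for key in ("artist", "title", "album", "discnumber", "tracknumber"):
    # EasyID3 stores each tag as a list of strings
    print(key, "=", audio.get(key, ["<missing>"])[0])
```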

Perhaps the most useful classification a song can have is ‘Mood’. Unlike other music metadata, there isn’t any standard mood classification for music. MoodLogic, which probably has the best mood database on the net (it’s commercial, though), classifies mood into “Aggressive, Mellow, Happy, Romantic, Sad, and Upbeat”, with the possibility of multiple selections for each song. Other popular classifications revolve around the idea of having a scale (from 1 to 10, say) to indicate how mellow or aggressive a song is. Whatever the method, however, it’s difficult to get proper mood metadata into your songs without having to dish out dollars for it. Allmusic.com’s style tags might suffice for albums, but for single tracks, I have to manually update every song.
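
If I’m stuck hand-tagging anyway, the mood can at least live inside the file itself rather than in some separate database. A sketch of stashing it in a custom ID3 TXXX frame with mutagen (the “MOOD” label is just a convention I’m inventing here, not any standard):

```python
# Sketch: store one or more moods in a custom TXXX frame ("MOOD" is my own convention).
from mutagen.id3 import ID3, TXXX

tags = ID3("song.mp3")  # placeholder path
tags.add(TXXX(encoding=3, desc="MOOD", text=["Aggressive", "Upbeat"]))  # multiple moods, MoodLogic-style
tags.save()
```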

Another useful but exotic tag is BPM (beats per minute). It’s easier to find BPM than mood for music, and combined with mood, it paves the way for playlists with extraordinary flexibility. Imagine making an aggressive playlist with a medium BPM just for those morning wake-up calls =).
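
Once both tags are in place, a playlist like that is just a filter over the collection. A rough sketch, again with mutagen, assuming mood sits in the custom TXXX:MOOD frame above and BPM in the standard TBPM frame (the directory, BPM range, and playlist name are made up):

```python
# Sketch: build an .m3u playlist from mood + BPM tags. Assumes the TXXX:MOOD
# convention from above; directory, BPM range and output name are placeholders.
import os
from mutagen.id3 import ID3

def build_playlist(music_dir, mood, bpm_range, out="morning.m3u"):
    lo, hi = bpm_range
    picks = []
    for root, _dirs, files in os.walk(music_dir):
        for name in files:
            if not name.lower().endswith(".mp3"):
                continue
            path = os.path.join(root, name)
            try:
                tags = ID3(path)
            except Exception:
                continue  # untagged or unreadable file
            moods = tags.getall("TXXX:MOOD")
            bpm = tags.get("TBPM")
            if not moods or not bpm:
                continue
            try:
                bpm_val = float(str(bpm.text[0]))
            except (ValueError, IndexError):
                continue
            # keep tracks whose mood matches and whose BPM falls in range
            if any(mood.lower() == m.lower() for m in moods[0].text) and lo <= bpm_val <= hi:
                picks.append(path)
    with open(out, "w") as fh:
        fh.write("\n".join(picks))
    return picks

# An "aggressive, medium tempo" wake-up playlist
build_playlist(os.path.expanduser("~/music"), "Aggressive", (100, 130))
```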

It’s the tagging that’s the difficult part; once I get that done, manipulating my music files to make wacky playlists is going to be amazing. I wish there were free tools to make things easier, though.
