In the mood for music

Could a computer distinguish between the mood of a mournful classical movement and that of an angst-ridden emo rock song? Research to be published in the International Journal of Computational Intelligence Studies suggests that it should be possible to categorise music accurately by mood without human listeners having to listen in.

An experimental algorithm developed by researchers in Poland could help the record industry automate playlist generation based on listener choices as well as allow users themselves to better organise their music collections.

Multimedia experts Bozena Kostek and Magdalena Plewa of Gdansk University of Technology point out that the metadata associated with a music file becomes redundant in a large collection, where many pieces of music share basic information such as composer, performer, copyright details and perhaps genre tags. As such, conventional management of music content, of the kind used by websites that stream and suggest music as well as by the software on computers and portable music players, is often ineffective. Handling vast collections, which might contain hundreds if not tens of thousands of song excerpts with overlapping metadata, is increasingly difficult, especially when it comes to allowing streaming sites and users to select songs across genres that share particular moods.

Of course, music appreciation is highly subjective, as is appreciation of any art form. "Musical expressivity can be described by properties such as meter, rhythm, tonality, harmony, melody and form," the team explains. These allow a technical definition of a given piece. "On the other hand, music can also be depicted by evaluative characteristics such as perception of preference, mood or emotions," they add. "Mood, as one of the pre-eminent functions of music, should be an important means for music classification," the team says.

Previous mood classification systems have used words such as rousing, passionate, fun, brooding and wistful, grouped in clusters, to help categorise a given piece. There are dozens of words that might describe a piece of music, and each might be associated with various emotions. The team has turned to a database of mp3 files containing more than 52,000 pieces of music to help them develop a statistical analysis that can automatically correlate different adjectives, and their associated emotions, with the specific pieces of music in the database.
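To make the word-cluster idea concrete, the Python sketch below assigns a song's adjective tags to the cluster that best covers them. The cluster names, member adjectives and vote-counting rule are invented for illustration; they are not the taxonomy or the statistical analysis used by the researchers.

```python
# Hypothetical mood clusters: each groups related adjective tags.
MOOD_CLUSTERS = {
    "energetic": {"rousing", "passionate", "fun"},
    "melancholy": {"brooding", "wistful", "mournful"},
}

def cluster_for(adjectives):
    """Vote each adjective tag into every cluster that contains it,
    then return the cluster with the most votes."""
    votes = {name: 0 for name in MOOD_CLUSTERS}
    for adj in adjectives:
        for name, words in MOOD_CLUSTERS.items():
            if adj in words:
                votes[name] += 1
    return max(votes, key=votes.get)

# Two melancholy tags outvote one energetic tag.
print(cluster_for(["brooding", "wistful", "fun"]))  # -> "melancholy"
```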

Fundamentally, the algorithm analyses the audio spectrum of samples from each track and is "taught" by human users which spectral patterns are associated with given moods. It can thus automatically classify future sound files with which it is presented, across a range of musical genres: alternative rock, classical, jazz, opera and rock. Tracks by artists including Coldplay, Maroon 5, Linda Eder, Imogen Heap, Paco de Lucia, Nina Sky, Dave Brubeck and many others were analysed, the team says.
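As a rough illustration of that supervised setup, the sketch below pairs spectral features from audio excerpts with human-assigned mood labels and then classifies a new track. The feature choice (MFCC summaries), the random-forest classifier, the label set and the file names are all assumptions made for the example, not the parametrisation described in the paper.

```python
import librosa
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def spectral_features(path):
    """Summarise a track excerpt as the mean and standard deviation
    of its MFCCs (an assumed stand-in for the paper's spectral parameters)."""
    y, sr = librosa.load(path, duration=30.0)  # analyse a 30-second sample
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

# "Teaching" phase: excerpts whose moods were assigned by human listeners.
# File names and labels here are placeholders for illustration.
train_paths = ["excerpt_01.mp3", "excerpt_02.mp3"]
train_moods = ["mournful", "fun"]

X = np.array([spectral_features(p) for p in train_paths])
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, train_moods)

# Classification phase: a new sound file is labelled automatically.
new_track = spectral_features("new_song.mp3")
print(model.predict(new_track.reshape(1, -1))[0])
```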

More information: Kostek, B. and Plewa, M. Parametrisation and correlation analysis applied to music mood classification, International Journal of Computational Intelligence Studies, 2013, 2, 4-25.

Provided by Inderscience

Citation: In the mood for music (2013, June 27) retrieved 25 April 2024 from https://phys.org/news/2013-06-mood-music.html
