
In the mood for music

Date:
June 26, 2013
Source:
Inderscience
Summary:
Could a computer distinguish between the moods of a mournful classical movement and an angst-ridden emo rock song? Research suggests that it should be possible to categorize music accurately by mood without human listeners having to listen in.

Could a computer distinguish between the moods of a mournful classical movement and an angst-ridden emo rock song? Research to be published in the International Journal of Computational Intelligence Studies suggests that it should be possible to categorise music accurately by mood without human listeners having to listen in.

An experimental algorithm developed by researchers in Poland could help the record industry automate playlist generation based on listener choices as well as allow users themselves to better organise their music collections.

Multimedia experts Bożena Kostek and Magdalena Plewa of Gdańsk University of Technology point out that the metadata associated with a music file becomes redundant in a large collection, where many pieces of music share basic information such as composer, performer, copyright details and perhaps genre tags. As such, conventional management of music content, of the kind used by websites that stream and suggest music as well as by the software on computers and portable music players, is often ineffective. Handling vast music collections, which might contain hundreds, if not tens of thousands, of song excerpts with overlapping metadata, is increasingly difficult, especially when it comes to letting streaming sites and users select songs across genres that share particular moods.

Of course, music appreciation is highly subjective, as is appreciation of any art form. "Musical expressivity can be described by properties such as meter, rhythm, tonality, harmony, melody and form," the team explains. These allow a technical definition of a given piece. "On the other hand, music can also be depicted by evaluative characteristics such as aesthetic experience, perception of preference, mood or emotions," they add. "Mood, as one of the pre-eminent functions of music should be an important means for music classification," the team says.

Previous mood classification systems have used words such as rousing, passionate, fun, brooding and wistful, grouped in clusters, to help categorise a given piece. There are dozens of words to describe a piece of music, and each might be associated with various emotions. The team has turned to a database of mp3 files containing more than 52,000 pieces of music to help them develop a statistical analysis that can automatically correlate different adjectives and their associated emotions with the specific pieces of music in the database.
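By way of illustration only (this is not the authors' published method), such a correlation could be computed over a table of listener-applied mood tags; the tag names, tracks and values below are hypothetical:

```python
# Toy illustration: correlating listener-applied mood tags across a
# music collection. Tag names, tracks and values are hypothetical and
# are not taken from the paper's corpus.
import pandas as pd

# Each row is a track; 1 means listeners applied that adjective to it.
tags = pd.DataFrame(
    {
        "rousing":    [1, 0, 0, 1, 0, 1],
        "passionate": [1, 0, 0, 1, 1, 1],
        "brooding":   [0, 1, 1, 0, 1, 0],
        "wistful":    [0, 1, 1, 0, 0, 0],
    },
    index=["track1", "track2", "track3", "track4", "track5", "track6"],
)

# Pairwise correlation between tag columns shows which adjectives tend
# to co-occur, so near-synonyms can be merged into one mood cluster.
print(tags.corr().round(2))
```

In this toy table, brooding and wistful correlate strongly and could be collapsed into a single mood cluster, shrinking dozens of adjectives down to a handful of usable categories.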

Fundamentally, the algorithm carries out an analysis of the audio spectrum of samples from each track and is "taught" by human listeners which spectral patterns are associated with given moods. It can thus automatically classify future sound files with which it is presented, across a range of musical genres: alternative rock, classical, jazz, opera and rock. Artists including Coldplay, Maroon 5, Linda Eder, Imogen Heap, Paco de Lucía, Nina Sky, Dave Brubeck and many others were analysed, the team says.
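A minimal sketch of that kind of pipeline, assuming the Python libraries librosa and scikit-learn (the paper's actual feature set and learning method are not detailed here), with hypothetical file names and mood labels:

```python
# Minimal sketch of a spectrum-based mood classifier, assuming librosa
# for feature extraction and scikit-learn for learning. File names and
# mood labels are hypothetical; this is not the authors' exact pipeline.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def spectral_features(path):
    """Summarise a track's audio spectrum as a fixed-length vector."""
    y, sr = librosa.load(path, duration=30.0)  # analyse a 30 s excerpt
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)
    # Mean and standard deviation over time keep the vector compact.
    return np.hstack([mfcc.mean(axis=1), mfcc.std(axis=1),
                      centroid.mean(axis=1), centroid.std(axis=1)])

# Human-labelled training excerpts "teach" the model which spectral
# patterns go with which moods.
train_files = ["calm_01.mp3", "angry_01.mp3", "calm_02.mp3", "angry_02.mp3"]
train_moods = ["calm", "angry", "calm", "angry"]

X = np.vstack([spectral_features(f) for f in train_files])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X, train_moods)

# New tracks can then be classified without anyone listening to them.
print(clf.predict([spectral_features("unknown_track.mp3")]))
```

Once trained, such a model only ever sees feature vectors, which is what allows it to tag unheard tracks by mood at the scale of a streaming catalogue.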


Story Source:

Materials provided by Inderscience. Note: Content may be edited for style and length.


Journal Reference:

  1. Bożena Kostek and Magdalena Plewa. Parametrisation and correlation analysis applied to music mood classification. International Journal of Computational Intelligence Studies, 2013; 2: 4-25

