Science News
from research organizations

Complex data becomes easier to interpret when transformed into music

Date:
October 30, 2023
Source:
Tampere University
Summary:
Researchers in the field of human-technology interaction have demonstrated how a custom-built "data-to-music" algorithm can help to better understand complex data. The transformation of digital data into sounds could be a game-changer in the growing world of data interpretation.
FULL STORY

A team of researchers in the field of human-technology interaction at Tampere University and Eastern Washington University has demonstrated how a custom-built "data-to-music" algorithm can help us to better understand complex data. The transformation of digital data into sounds could be a game-changer in the growing world of data interpretation.

The five-year research project was carried out by a group of researchers at TAUCHI, the Tampere Unit for Computer-Human Interaction at Tampere University in Finland, and at Eastern Washington University in the United States. The research was funded by Business Finland.

The group recently released a research paper that makes the case for transforming data into musical sounds, giving interpretation a new dimension.

The lead author of the article is Jonathan Middleton, DMA, a professor of music theory and composition at Eastern Washington University and a visiting researcher at Tampere University. Middleton and his co-investigators were primarily concerned with showing how a custom-built "data-to-music" algorithm could enhance engagement with complex data points. In their research they used data collected from Finnish weather records.

"In a digital world where data gathering and interpretation have become embedded in our daily lives, researchers propose new perspectives for the experience of interpretation," says Middleton.

According to him, the study validated what he calls a "fourth" dimension in data interpretation through musical characteristics.

"Musical sounds can be a highly engaging art form in terms of pure listening entertainment and, as such, a powerful complement to theater, film, video games, sports, and ballet. Since musical sounds can be highly engaging, this research offers new opportunities to understand and interpret data through our aural senses as well," Middleton explains.

For instance, imagine a simple one-dimensional view of your heart rate data on a graph. Then imagine a three-dimensional view of your heart rate data reflected in numbers, colors, and lines. Now, imagine a fourth dimension in which you can actually listen to that data. The key question in Middleton's research is: which of those displays or dimensions helps you understand the data best?
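To make the heart-rate example concrete, a minimal sonification can be sketched in a few lines of code. This is a hypothetical illustration, not the algorithm from the published study: it simply rescales numeric samples onto a range of MIDI pitch numbers so that higher values sound as higher notes. The function name, pitch range, and sample data are all assumptions made for the sketch.

```python
def data_to_pitches(values, low_note=48, high_note=84):
    """Linearly map numeric samples onto a MIDI pitch range (C3 to C6 by default)."""
    lo, hi = min(values), max(values)
    if hi == lo:
        # Flat data maps to the middle of the pitch range
        return [(low_note + high_note) // 2] * len(values)
    span = high_note - low_note
    return [round(low_note + (v - lo) / (hi - lo) * span) for v in values]

# Made-up heart-rate samples in beats per minute
heart_rate = [62, 64, 70, 85, 110, 95, 78, 66]
print(data_to_pitches(heart_rate))
```

A real data-to-music mapping would go further, driving not just pitch but musical characteristics such as rhythm, tempo, or timbre from the data; this sketch only shows the basic idea of turning a data series into an audible contour.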

For many users, in particular businesses that rely on data to meet consumer needs, this rigorous validation study shows which musical characteristics contribute the most to engagement with data. As Middleton sees it, the research sets the foundation for using that fourth dimension in data analysis.


Story Source:

Materials provided by Tampere University. Note: Content may be edited for style and length.


Journal Reference:

  1. Jonathan Middleton, Jaakko Hakulinen, Katariina Tiitinen, Juho Hella, Tuuli Keskinen, Pertti Huuskonen, Jeffrey Culver, Juhani Linna, Markku Turunen, Mounia Ziat, Roope Raisamo. Data-to-music sonification and user engagement. Frontiers in Big Data, 2023; 6 DOI: 10.3389/fdata.2023.1206081

