Music’s allure lies in its power to stir the emotions, but the relationship between the physical properties of an acoustic signal and its emotional impact remains an open area of research. This paper reports the results and possible implications of a pilot study and survey used to construct an emotion index for subjective ratings of music; the dimensions of pleasure and arousal exhibit high reliability. Eighty-five participants’ ratings of 100 song excerpts are used to benchmark the predictive accuracy of several combinations of acoustic preprocessing and statistical learning algorithms. The Euclidean distances between the acoustic representation of an excerpt and corresponding emotion-weighted visualizations of a corpus of music excerpts provided the predictor variables for a linear regression, which yielded the highest predictive accuracy for the mean pleasure and arousal values of test songs. This new technique also generated visualizations that show how rhythm, pitch, and loudness interrelate to influence our appreciation of the emotional content of music.
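As a rough illustration of the pipeline the abstract describes, the sketch below computes Euclidean distances between an excerpt’s acoustic representation and a set of emotion-weighted corpus visualizations, then uses those distances as predictors in a linear regression of mean pleasure ratings. All variable names, array shapes, and the placeholder data are assumptions for illustration only, not the thesis’s actual features or implementation.

```python
# Hypothetical sketch: Euclidean-distance predictors feeding a linear regression
# for mean pleasure (or arousal) ratings. Shapes and data are placeholders, not
# the thesis's actual acoustic features.
import numpy as np
from sklearn.linear_model import LinearRegression

def distance_features(excerpt_repr, reference_maps):
    """Euclidean distance from one excerpt's acoustic representation to each
    emotion-weighted reference visualization (assumed to share the same shape)."""
    return np.array([np.linalg.norm(excerpt_repr - ref) for ref in reference_maps])

# Assumed data: flattened acoustic representations for training excerpts,
# emotion-weighted corpus maps, and mean listener ratings on a 1-9 scale.
rng = np.random.default_rng(0)
n_train, map_size, n_refs = 80, 256, 4
train_reprs = rng.random((n_train, map_size))
reference_maps = rng.random((n_refs, map_size))   # emotion-weighted corpus maps
mean_pleasure = rng.uniform(1, 9, n_train)        # placeholder subjective ratings

# Distances to the reference maps become the regression's predictor variables.
X = np.vstack([distance_features(r, reference_maps) for r in train_reprs])
model = LinearRegression().fit(X, mean_pleasure)

# Predict the mean pleasure rating of a new test excerpt.
test_repr = rng.random(map_size)
predicted = model.predict(distance_features(test_repr, reference_maps).reshape(1, -1))
print(f"Predicted mean pleasure: {predicted[0]:.2f}")
```

A separate regression of the same distance features against mean arousal ratings would complete the two-dimensional (pleasure, arousal) prediction described above.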
Identifier | oai:union.ndltd.org:IUPUI/oai:scholarworks.iupui.edu:1805/2142
Date | 25 May 2010
Creators | Ough, Stuart G.
Contributors | MacDorman, Karl F.
Source Sets | Indiana University-Purdue University Indianapolis
Language | en_US
Detected Language | English
Type | Thesis