
Popular Music Analysis: Chorus and Emotion Detection

This thesis proposes chorus detection and emotion detection algorithms for popular
music. First, a popular song is decomposed into chorus and verse segments based on
its color representation and Mel-frequency cepstral coefficients (MFCCs). Four
features, including intensity, tempo, and rhythm regularity, are extracted from
these structured segments for emotion detection. The emotion of a song is then
classified into one of four classes (happy, angry, depressed, and relaxed) via two
classification methods: a back-propagation neural network classifier and an AdaBoost
classifier. A test database of 350 popular songs is used in the experiments.
Experimental results show that the average recall and precision of the proposed
chorus detection are approximately 95% and 84%, respectively, while the average
precision rate of emotion detection is 86% for the neural network classifier and
92% for the AdaBoost classifier. The emotions of songs with different cover versions
are also detected, with a precision rate of 92%.
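The classification stage described above can be sketched with scikit-learn's `AdaBoostClassifier`. This is a minimal illustration only: the synthetic four-dimensional feature vectors stand in for the thesis's extracted features (intensity, tempo, rhythm regularity, etc.), and the labels are artificially derived, so the numbers here do not reflect the thesis's reported results.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for per-song feature vectors (4 features per song),
# NOT the thesis's actual extracted features.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 4))

# Artificial 4-class labels: 0=happy, 1=angry, 2=depressed, 3=relaxed.
# Here they are derived from the signs of two features purely so the
# problem is learnable; real labels would come from human annotation.
y = (X[:, 0] > 0).astype(int) + 2 * (X[:, 1] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Boosted decision stumps, one of the two classifiers the thesis compares.
clf = AdaBoostClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("test accuracy:", round(clf.score(X_test, y_test), 2))
```

A back-propagation neural network (the thesis's other classifier) could be swapped in via `sklearn.neural_network.MLPClassifier` with the same `fit`/`score` interface.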

Identifier: oai:union.ndltd.org:NSYSU/oai:NSYSU:etd-0816110-123828
Date: 16 August 2010
Creators: Lin, Yu-Dun
Contributors: Zi-Tsan Chou, Ming-Sui Lee, Chia-Hung Yeh, Jih-ching Chiu, Tsang-Ling Sheu
Publisher: NSYSU
Source Sets: NSYSU Electronic Thesis and Dissertation Archive
Language: English
Detected Language: English
Type: text
Format: application/pdf
Source: http://etd.lib.nsysu.edu.tw/ETD-db/ETD-search/view_etd?URN=etd-0816110-123828
Rights: not_available, Copyright information available at source archive
