Detecting Emotional Response to Music using Near-infrared Spectroscopy of the Prefrontal Cortex

Many individuals with severe motor disabilities may not be able to use conventional means of emotional expression (e.g., vocalization, facial expression) to make their emotions known to others. Lacking a means of expressing emotions may adversely affect the quality of life of these individuals and their families. The main objective of this thesis was to implement a non-invasive means of identifying emotional arousal (neutral vs. intense) and valence (positive vs. negative) directly from brain activity. To this end, near-infrared spectroscopy (NIRS), which optically measures oxygenated and deoxygenated hemoglobin concentrations ([HbO2] and [Hb], respectively), was used to monitor prefrontal cortex hemodynamics in 10 individuals as they listened to music excerpts. Participants provided subjective ratings of arousal and valence. Prefrontal cortex [HbO2] and [Hb] were characterized with respect to valence and arousal, and significant emotion-related hemodynamic modulations were identified. These modulations were not significantly related to the characteristics of the music excerpts used to induce the emotions. These early investigations provided evidence for the use of prefrontal cortex NIRS in identifying emotions. Next, using features extracted from prefrontal cortex [HbO2] and [Hb], an average accuracy of 71% was achieved in identifying arousal and valence. Novel hemodynamic features extracted using dynamic modeling and template-matching were introduced for identifying arousal and valence. Finally, the thesis investigated whether autonomic nervous system (ANS) signals, including heart rate, electrodermal activity, and skin temperature, could improve on the identification results achieved using prefrontal cortex [HbO2] and [Hb] alone. For the majority of the participants, prefrontal cortex NIRS-based identification achieved higher classification accuracies than combined ANS and NIRS features. The results indicated that prefrontal cortex NIRS recordings during the presentation of music with emotional content can be automatically decoded in terms of both valence and arousal, encouraging future investigation of NIRS-based emotion detection in individuals with severe disabilities.
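
The abstract does not specify the classification pipeline, but a minimal Python sketch of the general approach (per-trial features from [HbO2]/[Hb], then a classifier for arousal or valence) might look as follows. Everything here is assumed for illustration: the 10 Hz sampling rate, the simple statistical features (the thesis itself used dynamic modeling and template-matching features), the linear discriminant classifier, and the synthetic arrays standing in for real NIRS recordings.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    def hemodynamic_features(hbo2, hb, fs=10.0):
        # Per-trial summary features from [HbO2] and [Hb] time series.
        # fs is an assumed sampling rate; the abstract does not state one.
        t = np.arange(len(hbo2)) / fs
        feats = []
        for sig in (hbo2, hb):
            slope = np.polyfit(t, sig, 1)[0]  # linear trend of the response
            feats.extend([sig.mean(), sig.std(), slope, sig.max() - sig.min()])
        return np.array(feats)

    # Synthetic placeholders: 40 trials of 60 s at 10 Hz. Real data would be
    # the recorded prefrontal [HbO2]/[Hb] during each music excerpt, with
    # labels taken from the participants' subjective ratings.
    rng = np.random.default_rng(0)
    X = np.vstack([hemodynamic_features(rng.standard_normal(600),
                                        rng.standard_normal(600))
                   for _ in range(40)])
    y = rng.integers(0, 2, size=40)  # e.g., 0 = neutral, 1 = intense arousal

    clf = LinearDiscriminantAnalysis()
    # With random labels this hovers near chance; the point is the pipeline.
    print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())

Within the same framework, the ANS channels mentioned above (heart rate, electrodermal activity, skin temperature) could be summarized per trial and concatenated with the NIRS feature vector to test whether fusion improves on NIRS alone, which is the comparison the thesis reports.
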

Identifier: oai:union.ndltd.org:TORONTO/oai:tspace.library.utoronto.ca:1807/65520
Date: 20 June 2014
Creators: Saba Moghimi
Contributors: Tom Chau, Anne Marie Guerguerian
Source Sets: University of Toronto
Language: en_ca
Detected Language: English
Type: Thesis