
利用生理感測資料之線上情緒辨識系統 / On-line Emotion Recognition System by Physiological Signals

A smart living environment should provide thoughtful services appropriate to the user's emotional state. The goal of our research is to develop an emotion recognition system that detects internal emotional states from variations in externally measurable physiological signals.
First, we applied the dimensional analysis approach and adopted the International Affective Picture System (IAPS) to design the stimuli for our psychological experiments. We collected physiological data and subjective ratings of arousal and valence from 20 subjects. We propose an emotion recognition learning algorithm that extracts a pattern for each emotion through cross-validation training and further adapts to the individual by incorporating real-time test data. Evaluating each subject's learning trend shows that the recognition rate improves incrementally. Furthermore, we adopted a dimensional-to-discrete emotion transformation to validate the subjective ratings. Compared with the results of related work, our system achieves higher recognition rates for arousal and valence in the dimensional analysis and clearly separates emotion categories in the discrete analysis.
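The dimensional-to-discrete transformation mentioned above can be illustrated with a minimal sketch. This is a hypothetical quadrant mapping on the valence-arousal plane, not the thesis's actual procedure: the rating scale, the `neutral_radius` threshold, and the four labels (`joy`, `anger`, `sadness`, `contentment`) are all assumptions chosen for illustration.

```python
def to_discrete(valence: float, arousal: float,
                neutral_radius: float = 1.0) -> str:
    """Map a (valence, arousal) rating pair, each on a scale centered
    at 0, to a coarse discrete emotion label by quadrant.

    Ratings close to the origin (within neutral_radius on both axes)
    are treated as neutral; otherwise the sign of each dimension
    selects one of four quadrant labels.
    """
    if abs(valence) < neutral_radius and abs(arousal) < neutral_radius:
        return "neutral"
    if valence >= 0:
        # positive valence: high arousal -> joy, low arousal -> contentment
        return "joy" if arousal >= 0 else "contentment"
    # negative valence: high arousal -> anger, low arousal -> sadness
    return "anger" if arousal >= 0 else "sadness"
```

For example, a strongly positive, highly arousing picture rated (valence = 3, arousal = 2) would map to "joy", while (valence = -3, arousal = -3) would map to "sadness".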
Most importantly, the system is implemented with wireless physiological sensors, making it mobile in use; it can reflect emotional states in real time in order to provide on-line smart services.

Identifier: oai:union.ndltd.org:CHENGCHI/G0095753038
Creators: 陳建家, Chen, Jian Jia
Publisher: 國立政治大學 (National Chengchi University)
Source Sets: National Chengchi University Libraries
Language: English
Detected Language: English
Type: text
Rights: Copyright © nccu library on behalf of the copyright holders
