
Classification of emotion using sub-audible frequencies in vocal data

Master of Science
Department of Computing and Information Sciences
David A. Gustafson

Current research on vocal emotion detection has taken a variety of approaches and has identified certain acoustic attributes that characterize different emotional states. Although classification has improved in recent years, computer classification remains far less accurate than human classification. This paper proposes a previously unexamined attribute that can be used as a measure for detecting emotion in human vocal samples. The new infrasonic attribute is shown to be significant for agitated emotions and can therefore help improve vocal emotion detection.
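The thesis does not specify in this abstract how the infrasonic attribute is computed, so the following is only a minimal sketch of the general idea: isolate the sub-audible band (below roughly 20 Hz) of a voice recording and summarize it as a single feature that a classifier could use. The function name, the 20 Hz cutoff, and the energy-ratio definition are all assumptions for illustration, not the author's method.

    import numpy as np
    from scipy.signal import butter, sosfiltfilt

    def infrasonic_band_energy(samples, sample_rate, cutoff_hz=20.0, order=4):
        """Fraction of signal energy below cutoff_hz.

        Hypothetical feature: low-pass filter the waveform below the
        approximate threshold of hearing (~20 Hz) and compare the energy
        in that band to the total energy of the recording.
        """
        # Design a low-pass Butterworth filter with the cutoff just below
        # the audible range.
        sos = butter(order, cutoff_hz, btype="low", fs=sample_rate, output="sos")
        low_band = sosfiltfilt(sos, samples.astype(np.float64))

        total_energy = float(np.sum(samples.astype(np.float64) ** 2)) + 1e-12
        band_energy = float(np.sum(low_band ** 2))
        return band_energy / total_energy

In use, such a feature would be computed per utterance and appended to the conventional acoustic attributes (pitch, energy, spectral measures) before training an emotion classifier; the abstract's claim is that the infrasonic component adds discriminative value for agitated emotions.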

http://hdl.handle.net/2097/776
Identifier: oai:union.ndltd.org:KSU/oai:krex.k-state.edu:2097/776
Date: January 1900
Creators: Narber, Cody G.
Publisher: Kansas State University
Source Sets: K-State Research Exchange
Language: en_US
Detected Language: English
Type: Thesis
