Social Agent: Facial Expression Driver for an e-Nose

This thesis shows that it is possible to drive the synthetic emotions of an interface agent with an electronic nose system developed at AASS. The e-Nose can be used for quality control: the detected distortion from a known smell-sensation prototype is mapped to a three-dimensional representation of emotional states, which in turn selects a set of pre-defined muscle contractions. This extension of a rule-based motivation system, which we call the Facial Expression Driver, is incorporated into a model for sensor fusion with active perception, providing a general design for a more complex system with additional senses. To remain consistent with the biologically inspired sensor-fusion model, a muscle-based animated facial model was chosen as a test bed for expressing the current emotion. The social agent's facial expressions display its tolerance of the detected distortion, so as to induce the user to restore the system to functional balance. Only a few known projects use chemically based sensing to drive a face in real time, whether virtual characters or animatronics. This work may inspire a future android implementation of a head with electroactive polymers as synthetic facial muscles.
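The abstract outlines a three-stage pipeline: a measured distortion from a known smell prototype is mapped into a 3D emotion space, and the resulting emotional state selects a pre-defined set of muscle contractions. The thesis text is not reproduced here, so the following Python sketch is purely illustrative: the emotion axes (valence, arousal, stance), the mapping functions, the muscle names, and all thresholds are assumptions for demonstration, not the actual Facial Expression Driver.

    # Hypothetical sketch of the described pipeline:
    # e-Nose distortion -> 3D emotional state -> pre-defined muscle contractions.
    # Axes, muscle names, and thresholds are illustrative assumptions only.

    import math

    def emotion_state(distortion: float) -> tuple[float, float, float]:
        """Map a normalized distortion (0 = prototype smell, 1 = maximal
        deviation) to an assumed (valence, arousal, stance) point."""
        valence = 1.0 - 2.0 * distortion       # pleasant at 0, unpleasant at 1
        arousal = math.tanh(3.0 * distortion)  # deviation raises alertness
        stance = 1.0 - distortion              # openness drops with deviation
        return (valence, arousal, stance)

    # Illustrative lookup from emotion-space regions to contraction sets
    # (activation levels for a few muscles of a muscle-based facial model).
    EXPRESSIONS = {
        "content": {"zygomatic_major": 0.7, "orbicularis_oculi": 0.3},
        "alert":   {"frontalis": 0.6, "levator_palpebrae": 0.5},
        "disgust": {"levator_labii": 0.8, "corrugator": 0.6},
    }

    def facial_expression(distortion: float) -> dict[str, float]:
        """Select a pre-defined contraction set from the emotional state."""
        valence, arousal, _ = emotion_state(distortion)
        if valence > 0.3:
            return EXPRESSIONS["content"]
        if arousal > 0.8 and valence <= -0.3:
            return EXPRESSIONS["disgust"]
        return EXPRESSIONS["alert"]

    if __name__ == "__main__":
        for d in (0.05, 0.5, 0.95):
            print(f"distortion={d:.2f} ->", facial_expression(d))

The point this sketch illustrates is the indirection the abstract describes: sensor distortion never drives the muscles directly but always passes through the intermediate emotional state, which is what would let additional senses feed the same 3D representation in a larger sensor-fusion system.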

Identifier: oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:liu-1863
Date: January 2003
Creators: Widmark, Jörgen
Publisher: Linköpings universitet, Institutionen för teknik och naturvetenskap (Department of Science and Technology)
Source Sets: DiVA Archive at Uppsala University
Language: English
Detected Language: English
Type: Student thesis, info:eu-repo/semantics/bachelorThesis, text
Format: application/pdf
Rights: info:eu-repo/semantics/openAccess
