Social Agent: Facial Expression Driver for an e-Nose

Widmark, Jörgen January 2003 (has links)
This thesis demonstrates that the synthetic emotions of an interface agent can be driven by an electronic nose system developed at AASS. The e-Nose can be used for quality control: the detected distortion from a known smell-sensation prototype is interpreted as a point in a 3D representation of emotional states, which in turn selects a set of pre-defined muscle contractions. This extension of a rule-based motivation system, which we call the Facial Expression Driver, is incorporated into a model for sensor fusion with active perception, providing a general design for a more complex system with additional senses. To remain consistent with the biologically inspired sensor-fusion model, a muscle-based animated facial model was chosen as a test bed for expressing the current emotion. The social agent's facial expressions convey its tolerance of the detected distortion, prompting the user to restore the system to functional balance. Only a few known projects use chemically based sensing to drive a face in real time, whether virtual characters or animatronics. This work may inspire a future android implementation of a head with electroactive polymers as synthetic facial muscles.
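The pipeline the abstract outlines (e-nose distortion → 3D emotional state → pre-defined muscle contractions) can be sketched as below. This is a minimal illustrative assumption of how such a Facial Expression Driver might be structured; the function names, the choice of a valence/arousal/stance emotion space, and the specific mappings are hypothetical and do not come from the thesis itself.

```python
# Hypothetical sketch of the Facial Expression Driver pipeline:
# e-Nose distortion -> 3D emotional state -> pre-defined muscle contractions.
# All names, axes, and mappings below are illustrative assumptions.

def distortion_to_emotion(distortion: float) -> tuple[float, float, float]:
    """Map a scalar distortion from the known smell prototype to a
    (valence, arousal, stance) point in an assumed 3D emotion space."""
    valence = max(-1.0, 1.0 - 2.0 * distortion)  # more distortion -> displeasure
    arousal = min(1.0, distortion)               # more distortion -> alertness
    stance = -min(1.0, distortion)               # more distortion -> withdrawal
    return (valence, arousal, stance)

def emotion_to_muscles(emotion: tuple[float, float, float]) -> dict[str, float]:
    """Select pre-defined contraction levels (0..1) for a few facial
    muscles of a muscle-based animated face, given an emotion point."""
    valence, arousal, stance = emotion
    return {
        "zygomatic_major": max(0.0, valence),      # smile on positive valence
        "corrugator": max(0.0, -valence),          # brow frown on displeasure
        "levator_palpebrae": 0.5 + 0.5 * arousal,  # eyes widen with arousal
    }

# A strongly distorted smell reading drives a displeased, alert expression.
muscles = emotion_to_muscles(distortion_to_emotion(0.8))
```

In a design like this, the emotion space decouples the sensing modality from the face: additional senses fused into the same 3D state would drive the identical muscle mapping, which matches the abstract's goal of a general design for systems with more senses.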