
3D facial expression classification using a statistical model of surface normals and a modular approach

Following the success of 3D face recognition, the face processing community is now working to establish robust 3D facial expression recognition. Facial expressions provide communication cues from which mood, meaning and emotion can be interpreted simultaneously. With current advanced 3D scanner technology, direct anthropometric measurements (i.e. comparative measurements of the sizes and proportions of the human body) are easily obtainable, yielding 3D geometric data well suited to 3D face processing studies. Instead of using the raw 3D facial points, we extract their derivative, the 3D facial surface normals. We construct a statistical model of the variations in facial shape caused by the six basic expressions, using the 3D facial surface normals as feature vectors. In particular, we are interested in how such expression variations manifest themselves as changes in the field of 3D facial surface normals. We employ a modular approach in which each module contains the facial features of a distinct facial region. Decomposing the face into several modules promotes the learning of local facial structure, so the most discriminative variation of the facial features in each module is emphasized. We decompose the face into six modules, and expression classification for each module is carried out independently. We then construct a Weighted Voting Scheme (WVS) to infer the emotion underlying the collection of modules, with the module weights determined by the AdaBoost learning algorithm. In our experiments, using 3D facial surface normals as the feature vector yields better performance than 3D facial points and 3D distance measurements in facial expression classification under both the WVS and a Majority Voting Scheme (MVS). The results suggest that surface normals do indeed produce comparable results, particularly for the six basic facial expressions without intensity information.
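The abstract describes two algorithmic ingredients: per-vertex surface normals derived from the raw 3D facial points, and a weighted vote over independently classified facial modules. The following is a minimal sketch of both, assuming a triangulated face mesh and already-available per-module predictions and weights; the function names, the six-module split and the example weights are illustrative assumptions, not the thesis implementation.

    # Minimal sketch (assumed, not the thesis code): surface-normal features
    # and a weighted voting scheme over per-module expression labels.
    import numpy as np

    EXPRESSIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

    def vertex_normals(vertices, faces):
        """Unit surface normals per vertex of a triangulated 3D face mesh.

        vertices : (N, 3) float array of 3D facial points
        faces    : (M, 3) int array of vertex indices per triangle
        """
        normals = np.zeros_like(vertices, dtype=float)
        tri = vertices[faces]                               # (M, 3, 3)
        # Face normal = cross product of two triangle edges (area-weighted).
        face_n = np.cross(tri[:, 1] - tri[:, 0], tri[:, 2] - tri[:, 0])
        # Accumulate each face normal onto its three vertices.
        for k in range(3):
            np.add.at(normals, faces[:, k], face_n)
        lengths = np.linalg.norm(normals, axis=1, keepdims=True)
        return normals / np.clip(lengths, 1e-12, None)

    def weighted_vote(module_labels, module_weights):
        """Combine per-module expression labels into one decision.

        module_labels  : predicted expression name for each facial module
        module_weights : non-negative module weights (e.g. learned by AdaBoost)
        """
        scores = {e: 0.0 for e in EXPRESSIONS}
        for label, w in zip(module_labels, module_weights):
            scores[label] += w
        return max(scores, key=scores.get)

    # Illustrative usage with six hypothetical modules:
    labels  = ["happiness", "happiness", "surprise", "happiness", "sadness", "happiness"]
    weights = [0.9, 0.7, 0.4, 0.8, 0.3, 0.6]   # assumed module reliabilities
    print(weighted_vote(labels, weights))       # -> "happiness"

Setting every weight to 1.0 in this sketch reduces the weighted vote to a simple majority vote, which mirrors the WVS/MVS comparison mentioned in the abstract.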

Identifier: oai:union.ndltd.org:bl.uk/oai:ethos.bl.uk:575637
Date: January 2013
Creators: Ujir, Hamimah
Publisher: University of Birmingham
Source Sets: Ethos UK
Detected Language: English
Type: Electronic Thesis or Dissertation
Source: http://etheses.bham.ac.uk//id/eprint/4371/