
Facial emotion expression, recognition and production of varying intensity in the typical population and on the autism spectrum

The current research project aimed to investigate facial emotion processing using specially developed and validated video stimuli of facial emotional expressions at varying levels of intensity. To this end, videos were developed showing real people expressing emotions in real time (anger, disgust, fear, sadness, surprise, happiness, contempt, embarrassment, pride, and neutral) at different expression intensity levels (low, intermediate, high), called the Amsterdam Dynamic Facial Expression Set – Bath Intensity Variations (ADFES-BIV). The ADFES-BIV was validated on all of its emotion and intensity categories. Sex differences in facial emotion recognition were investigated, and a female advantage in facial emotion recognition was found. This demonstrates that the ADFES-BIV is suitable for investigating group comparisons in facial emotion recognition in the general population. Facial emotion recognition from the ADFES-BIV was further investigated in a sample of individuals characterised by deficits in social functioning: individuals with an Autism Spectrum Disorder (ASD). A deficit in facial emotion recognition was found in ASD compared to controls, and error analysis revealed emotion-specific deficits in detecting emotional content from faces (sensitivity) in addition to deficits in differentiating between emotions from faces (specificity). The ADFES-BIV was combined with facial electromyography (EMG) to investigate facial mimicry and the effects of proprioceptive feedback (from explicit imitation and blocked facial mimicry) on facial emotion recognition. Based on the reverse simulation model, it was predicted that facial mimicry would be an active component of the facial emotion recognition process. Experimental manipulations of face movements did not reveal an advantage of facial mimicry over the blocked facial mimicry condition. While no support was found for the reverse simulation model, the results indicate that enhanced proprioceptive feedback can facilitate or hinder recognition of facial emotions, in line with embodied cognition accounts.

Identifier oai:union.ndltd.org:bl.uk/oai:ethos.bl.uk:704810
Date January 2016
Creators Wingenbach, Tanja
Contributors Brosnan, Mark; Ashwin, Christopher
Publisher University of Bath
Source Sets Ethos UK
Detected Language English
Type Electronic Thesis or Dissertation