
Perception of Emotion from Facial Expression and Affective Prosody

Real-world perception of emotion results from the integration of multiple cues, most notably facial expression and affective prosody. Incongruent emotional stimuli present an opportunity to study the interaction between these sensory modalities. Thirty-seven participants were exposed to audio-visual stimuli (Robins & Schultz, 2004) including angry, fearful, happy, and neutral presentations. Eighty stimuli contained matching emotions and 240 contained incongruent emotional cues. Matching emotions elicited a significant number of correct responses for all four emotions. Sign tests indicated that for most incongruent conditions, participants demonstrated a bias toward the visual modality. Despite these findings, specific incongruent conditions did show evidence of blending. Future research should explore an evolutionary model of facial expression as a means for behavioral adaptation and the possibility of an "emotional McGurk effect" in particular combinations of emotions.
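As a rough illustration of the kind of sign test described above (not the thesis's own analysis), the sketch below uses a two-sided binomial sign test to ask whether responses on incongruent trials favor the visual modality more often than chance. The participant counts are hypothetical placeholders, not data from the study.

```python
# Minimal sketch of a sign test for visual-modality bias on incongruent trials.
# Counts below are hypothetical, for illustration only.
from scipy.stats import binomtest

visual_dominant = 30    # participants whose incongruent-trial responses mostly matched the face
auditory_dominant = 7   # participants whose responses mostly matched the voice

# Under the null hypothesis of no modality bias, each outcome is equally likely (p = 0.5).
result = binomtest(visual_dominant, visual_dominant + auditory_dominant, p=0.5)
print(f"Sign test p-value: {result.pvalue:.4f}")
```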

Identifier: oai:union.ndltd.org:GEORGIA/oai:digitalarchive.gsu.edu:psych_theses-1016
Date: 09 June 2006
Creators: Santorelli, Noelle Turini
Publisher: Digital Archive @ GSU
Source Sets: Georgia State University
Detected Language: English
Type: text
Format: application/pdf
Source: Psychology Theses
