1. Children's perception of the emotional content of music. Trunk, Barry. January 1981.
No description available.

2. A study to investigate the ability of people with and without learning disabilities to express and recognise emotion in the human face. Wallis, Alison. January 2000.
No description available.

3. Judgments of Spontaneous Facial Expressions of Emotion across Cultures and Languages: Testing the Universality Thesis. Kayyal, Mary Hanna. January 2014.
Thesis advisor: James A. Russell.
The claim that certain emotions are universally recognized from facial expressions is based primarily on the study of posed expressions. The current study examined spontaneous facial expressions shown by aborigines in Papua New Guinea (Ekman, 1980): 18 faces claimed to convey one (or, in the case of blends, two) basic emotions and four faces claimed to show other universal feelings. For each face, ten samples of observers rated the degree to which each of the 12 predicted emotions or feelings was conveyed: South Koreans speaking Korean (n = 66), Spaniards speaking Spanish (n = 54), Israelis speaking Hebrew (n = 60), Chinese speaking English (n = 83), Chinese speaking Cantonese (n = 64), Japanese speaking English (n = 71), Japanese speaking Japanese (n = 72), Indians speaking English (n = 65), Indians speaking Kannada (n = 62), and Indians speaking Hindi (n = 120). The modal choice across all ten samples of observers was the predicted label for only 2 of the 22 faces, both predicted to convey exclusively happiness. Observers endorsed the predicted emotion or feeling moderately often (mean = 56%) but also denied it moderately often (mean = 44%). They also endorsed more than one (or, for blends, two) label(s) for each face: on average, 1.8 of the basic emotions and 3.7 of the other feelings. There were both similarities and differences across culture and language, but the emotional meaning of a facial expression is not well captured by the predicted label(s) or, indeed, by any single label.
Thesis (PhD), Boston College, 2014. Submitted to: Boston College, Graduate School of Arts and Sciences. Discipline: Psychology.
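A minimal sketch of the kind of aggregation described above (the modal label per face within each observer sample, and the endorsement rate for the predicted label), assuming a hypothetical long-format ratings table; the column names, rating scale, and values below are illustrative assumptions, not taken from the thesis.

```python
import pandas as pd

# Hypothetical long-format ratings: one row per observer sample x face x candidate
# label, with the endorsed degree in 'rating' and a flag marking the label the
# universality thesis predicts for that face. Values are made up for illustration.
ratings = pd.DataFrame({
    "sample":    ["Korean", "Korean", "Spanish", "Spanish"],
    "face":      [1, 1, 1, 1],
    "label":     ["happiness", "sadness", "happiness", "sadness"],
    "rating":    [3, 0, 2, 1],          # e.g. a 0-3 endorsement scale
    "predicted": [True, False, True, False],
})

# Modal choice: the highest-rated label for each face within each observer sample.
idx = ratings.groupby(["sample", "face"])["rating"].idxmax()
modal_choice = ratings.loc[idx, ["sample", "face", "label"]]

# Endorsement rate for the predicted label: proportion of non-zero ratings per face.
predicted = ratings[ratings["predicted"]]
endorsement = (predicted["rating"] > 0).groupby(predicted["face"]).mean()

print(modal_choice)
print(endorsement)
```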

4. Child sexual offenders’ recognition of facial affect: are offenders less sensitive to emotions in children? Stevens, Christopher. January 2015.
Understanding the risk factors that contribute to sexual offending against children is an important topic for research. The present study set out to examine whether deficits in emotion recognition might contribute to sexual offending, by testing whether child sexual offenders were impaired, relative to non-offender controls, in their recognition of facial expressions of emotion, particularly in children. To do this, we tested 49 child sexual offenders and 46 non-offender controls on their ability to recognise facial expressions of emotion, using photographs of both adults and children posing emotions from the Radboud Faces Database (Langner et al., 2010). Using morphing software, we created continua in 10% increments along six emotion pairs (e.g., happiness-sadness) drawn from the emotions of sadness, anger, happiness, and fear.
Using signal detection analyses, we found that, across the emotion pairs, non-offenders were significantly better able to discriminate between emotions than offenders, although there were no significant differences within individual emotion pairs, and the overall difference was not significant with either age or level of education as a covariate. When discriminating between fear and anger, non-offenders showed a significant bias towards labeling an emotion as fear when judging male faces, whereas offenders did not, and this difference remained significant with age, level of education, and socioeconomic status as covariates. Additionally, both groups showed a strong bias towards labeling an emotion as anger when judging female faces. Thus offenders were more likely than non-offenders to identify anger rather than fear in male faces, suggesting that offenders lacked the inhibition against recognising anger in males that non-offenders showed.
Overall, contrary to our predictions, we found no evidence to indicate that child sexual offenders showed a specific deficit in their recognition of emotions in children. However, future research should continue to examine this area and its potential link to recidivism.
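The signal detection analyses mentioned above conventionally derive sensitivity (d') and response bias (criterion c) from hit and false-alarm rates. The sketch below is a generic illustration, not the thesis's analysis code; the log-linear correction and the example trial counts are my own assumptions.

```python
from scipy.stats import norm

def dprime_and_criterion(hits, misses, false_alarms, correct_rejections):
    """Sensitivity (d') and criterion (c) for a two-alternative discrimination,
    e.g. treating 'fear' responses to fear-end morphs as hits and 'fear'
    responses to anger-end morphs as false alarms. A log-linear correction
    keeps rates of 0 or 1 finite (an assumption, not necessarily the
    correction used in the thesis)."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
    return z_hit - z_fa, -0.5 * (z_hit + z_fa)

# Invented counts: 40 fear-end trials (32 labelled 'fear') and
# 40 anger-end trials (10 labelled 'fear').
d_prime, criterion = dprime_and_criterion(32, 8, 10, 30)
print(f"d' = {d_prime:.2f}, c = {criterion:.2f}")
```

With 'fear' treated as the signal category, a negative criterion corresponds to a bias towards the 'fear' label, the kind of labeling bias contrasted between groups above.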

5. Accuracy and Judgment Bias of Low Intensity Emotional Expression Among Individuals with Major Depression. Bakerman, Davina. 23 April 2013.
It has been suggested that depressed individuals have difficulties decoding emotional facial expressions in others, contributing to a negative cycle of interpersonal difficulties. Some studies have demonstrated global deficits in the processing of emotional facial expressions compared to non-depressed participants, whereas others have noted differences for specific emotions. Methodological issues, including the operationalization of accuracy and bias and the examination of a limited range of emotions and intensities, may partially explain the mixed findings. The aim of the current study was to examine differences in accuracy in the detection of emotional facial expressions between participants with MDD (currently depressed, partially remitted, and those with a lifetime history of MDD) and non-depressed comparisons. Methodological limitations of previous studies were addressed by: (a) using the unbiased hit rate (Wagner, 1993), a more precise measure of accuracy for specific emotions; (b) using a more precise measure of judgment bias that takes into account the overuse or underuse of specific emotion categories; (c) including the six basic emotions; and (d) incorporating expressions ranging from 20% to 100% intensity. A secondary aim was to determine whether transient mood state is predictive of accuracy scores regardless of diagnostic status. Thirty-seven depressed and 34 non-depressed participants recruited from the ROHCG Mood Disorders program and the University of Ottawa took part in this study. Clinical status was assessed using the Structured Clinical Interview for DSM-IV (SCID-IV) and the Beck Depression Inventory-II (BDI-II). Participants also completed the Profile of Mood States-Bipolar (POMS-BI) form to assess mood state at the time of testing. The facial recognition task presented happiness, sadness, anger, fear, disgust, and surprise at 20%-100% intensity, each for 500 ms; participants pressed a computer key to identify the emotion presented. Results indicated that both groups of depressed participants were more accurate than non-depressed participants in detecting anger at 20% intensity. Depressed participants also showed a bias away from surprise. Group differences at high intensity were non-significant; however, participants with current depression and partial remission showed a bias towards anger at 50% intensity. Regression analyses using the POMS Agreeable-Hostile (POMS-AH) and Elated-Depressed (POMS-ED) scales tested whether mood state predicted accuracy in the detection of anger and sadness; these models were non-significant. Results of this study are considered in the context of cognitive and cognitive-interpersonal theories of MDD.
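For context, the unbiased hit rate (Wagner, 1993) cited above corrects raw hit rates for differential use of response categories: for each emotion it is the squared count of correct identifications divided by the product of the corresponding row and column totals of the stimulus-by-response confusion matrix. A minimal sketch follows; the confusion matrix values are hypothetical and not drawn from the study's data.

```python
import numpy as np

def unbiased_hit_rate(confusion):
    """Wagner's (1993) unbiased hit rate per stimulus emotion.
    confusion[i, j] = number of trials on which stimulus emotion i drew response j;
    Hu_i = n_ii**2 / (row_total_i * column_total_i)."""
    confusion = np.asarray(confusion, dtype=float)
    correct = np.diag(confusion)
    row_totals = confusion.sum(axis=1)   # how often each emotion was presented
    col_totals = confusion.sum(axis=0)   # how often each label was chosen
    return correct ** 2 / (row_totals * col_totals)

# Hypothetical 3-emotion confusion matrix (rows/columns: happiness, sadness, anger).
confusion = [[18, 1, 1],
             [4, 12, 4],
             [2, 5, 13]]
print(unbiased_hit_rate(confusion))  # one Hu value per stimulus emotion
```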

6. Facial Expression Intelligence Scale (FEIS): Recognizing and Interpreting Facial Expressions and Implications for Consumer Behavior. Pierce, Meghan. 02 May 2012.
Each time we meet a new person, we draw inferences based on our impressions. The first thing we are likely to notice is a person's face. The face functions as one source of information, which we combine with the spoken word, body language, past experience, and the context of the situation to form judgments. Facial expressions serve as pieces of information we use to understand what another person is thinking, saying, or feeling. While there is strong support for the universality of emotion recognition, the ability to identify and interpret facial expressions varies by individual. Existing scales fail to capture the dynamic nature of the face. Five studies are proposed to examine the viability of the Facial Expression Intelligence Scale (FEIS) as a measure of individual ability to identify and interpret facial expressions. Consumer behavior implications are discussed.
Ph.D.

7. Negatively Biased Facial Affect Discernment and Socially Inhibited Behavior in Middle Childhood. Garcia, Sarah Elizabeth. 10 May 2017.
Negatively biased facial affect discernment may prompt socially inhibited behavior. Characterizing normative patterns of facial affect discernment across emotions and expression intensities during middle childhood will help to identify subtle yet meaningful deviations that may emerge for individual children and potentially impair their social behavior. This study measured facial affect discernment for happy, sad, and angry expressions at low, medium, and high intensities, along with parent-reported socially inhibited behavior, in a sample of 7- to 10-year-old children (N = 80; 53% female). Discernment accuracy improved with increased expression intensity for all emotions. Specifically, we found a quartic effect for the association between intensity and accuracy for anger, and negative quadratic effects (decelerating positive rates of change) for the associations between intensity and accuracy for happiness and for sadness. Additionally, discernment accuracy for happiness was generally better than for sadness and anger, and discernment accuracy for anger was generally better than for sadness. At low intensity, however, discernment accuracy for sadness was comparable to accuracy for happiness but better than for anger. Neither misidentification of neutral and low intensity faces as negative nor discernment accuracy for happiness at low intensity was significantly associated with socially inhibited behavior. Although accurate discernment of anger and sadness at low intensity was not significantly related to socially inhibited behavior, better discernment accuracy for anger and sadness at medium intensity was significantly related to more socially inhibited behavior. Overall, these results enhance understanding of normative facial affect discernment and its relation to maladaptive social behavior in middle childhood, a developmental stage at which intervention efforts may prove effective at heading off the detrimental outcomes associated with socially inhibited behavior (loneliness, low self-esteem, peer victimization, social anxiety, and depression) that increase in late childhood and adolescence.
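To illustrate the polynomial trends reported above (a quartic effect for anger; decelerating, negative-quadratic rises for happiness and sadness), accuracy-by-intensity curves can be characterized by fitting polynomial terms to accuracy as a function of expression intensity. The sketch below uses invented intensity levels and accuracies purely for illustration; it is not the study's analysis.

```python
import numpy as np

# Invented mean discernment accuracies (proportion correct) at several morph
# intensity levels for a single emotion; purely illustrative values.
intensity = np.array([0.1, 0.25, 0.4, 0.55, 0.7, 0.85, 1.0])
accuracy = np.array([0.35, 0.52, 0.68, 0.78, 0.84, 0.87, 0.88])

# Fit linear, quadratic, and quartic trends; a negative quadratic coefficient
# captures a rise in accuracy that decelerates at higher intensities.
for degree in (1, 2, 4):
    coefs = np.polynomial.polynomial.polyfit(intensity, accuracy, degree)
    fitted = np.polynomial.polynomial.polyval(intensity, coefs)
    sse = np.sum((accuracy - fitted) ** 2)
    print(f"degree {degree}: coefficients {np.round(coefs, 3)}, SSE = {sse:.4f}")
```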

8. The role of affective information in context on the judgment of facial expression: in what situations are North Americans influenced by contextual information? Ito, Kenichi. Unknown Date.
No description available.

9. Effects of Emotional Expressions on Eye Gaze Discrimination and Attentional Cuing. Lee, Daniel Hyuk-Joon. 15 February 2010.
Recent evidence has shown that emotional facial expressions evolved to functionally benefit the expression’s sender, with fear increasing and disgust decreasing sensory acquisition. Using schematic eyes alone, derived from actual participants’ fear and disgust expressions but lacking other emotional content, we examined the functional action resonance hypothesis that adaptive benefits are also conferred on the expression’s receiver. Participants’ eye gaze discrimination was enhanced when viewing wider, “fear” eyes versus narrower, “disgust” eyes (Experiment 1). In a gaze cuing paradigm, task facilitation in the form of faster responses to the target was found when viewing wider versus narrower eyes (Experiment 2). Contrary to our hypothesis, attentional modulation did not differ for wider versus narrower eyes (Experiments 2 and 3). Nonetheless, we argue that the evidence supports the functional action resonance hypothesis.
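The gaze cuing effect referred to above is typically indexed as the difference in response time between targets appearing at the uncued versus the gazed-at location, computed separately for each eye-shape condition. The sketch below is schematic; the condition names and response times are invented for illustration.

```python
import statistics

# Invented trial-level response times (ms) by eye shape and cue validity.
trials = [
    {"eyes": "wide",   "cue": "congruent",   "rt": 405},
    {"eyes": "wide",   "cue": "congruent",   "rt": 419},
    {"eyes": "wide",   "cue": "incongruent", "rt": 441},
    {"eyes": "wide",   "cue": "incongruent", "rt": 455},
    {"eyes": "narrow", "cue": "congruent",   "rt": 428},
    {"eyes": "narrow", "cue": "congruent",   "rt": 434},
    {"eyes": "narrow", "cue": "incongruent", "rt": 452},
    {"eyes": "narrow", "cue": "incongruent", "rt": 458},
]

def cuing_effect(trials, eyes):
    """Gaze cuing effect: mean RT on incongruent minus congruent trials."""
    mean_rt = lambda cue: statistics.mean(
        t["rt"] for t in trials if t["eyes"] == eyes and t["cue"] == cue)
    return mean_rt("incongruent") - mean_rt("congruent")

for eyes in ("wide", "narrow"):
    print(f"{eyes} eyes: cuing effect = {cuing_effect(trials, eyes)} ms")
```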