  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1

Adult ageing and emotion perception

Lawrie, Louisa January 2018 (has links)
Older adults are worse than young adults at perceiving emotions in others. However, it is unclear why these age-related differences in emotion perception exist. The studies presented in this thesis investigated the cognitive, emotional and motivational factors influencing age differences in emotion perception. Study 1 revealed no age differences in mood congruence effects: sad faces were rated as sadder when participants experienced negative mood. In contrast, Study 2 demonstrated that sad mood impaired recognition accuracy for sad faces. Together, these findings suggest that different methods of assessing emotion perception engage discrete processing strategies. These mood influences on emotion perception are similar in young and older adults. Studies 3 and 4 investigated age differences in emotion perception tasks that are more realistic and contextualised than still photographs of facial expressions. Older adults were worse than young adults at recognising emotions from silent dynamic displays; however, older adults outperformed young adults in a film task that displayed emotional information in multiple modalities (Study 3). Study 4 suggested that the provision of vocal information was particularly beneficial to older adults. Furthermore, vocabulary mediated the relationship between age and performance on the contextual film task. However, age-related deficits in decoding basic emotions were established in a separate multi-modal video-based task. Age differences in the perception of neutral expressions were also examined: neutral expressions were interpreted as displaying positive emotions by older adults. Using a dual-task paradigm, Study 5 suggested that working memory processes are involved in decoding emotions; however, age-related declines in working memory were not driving age effects in emotion perception. Neuropsychological, motivational and cognitive explanations for these results are evaluated.
Implications of these findings for older adults' social functioning are discussed.
2

Associations between autistic traits and emotion recognition ability in non-clinical young adults

Lindahl, Christina January 2013 (has links)
This study investigated the associations between emotion recognition ability and autistic traits in a sample of non-clinical young adults. Two hundred and forty-nine individuals took part in an emotion recognition test, which assessed recognition of 12 emotions portrayed by actors. Emotion portrayals were presented as short video clips, both with and without sound, and as sound only. Autistic traits were assessed using the Autism Spectrum Quotient (ASQ) questionnaire. Results showed that men had higher ASQ scores than women, and some sex differences in emotion recognition were also observed. The main finding was that autistic traits were correlated with several measures of emotion recognition. More specifically, ASQ scores were negatively correlated with recognition of fear and with recognition of ambiguous stimuli.
3

Emotion Recognition Using Glottal and Prosodic Features

Iliev, Alexander Iliev 21 December 2009 (has links)
Emotion conveys the psychological state of a person. It is expressed by a variety of physiological changes, such as changes in blood pressure, heart rate and degree of sweating, and can be manifested in shaking, changes in skin coloration, facial expression, and the acoustics of speech. This research focuses on the recognition of emotion conveyed in speech. There were three main objectives of this study. One was to examine the role played by the glottal source signal in the expression of emotional speech. The second was to investigate whether it can provide improved robustness in real-world situations and in noisy environments; this was achieved through testing in clean and various noisy conditions. Finally, the performance of glottal features was compared to diverse existing and newly introduced emotional feature domains. A novel glottal symmetry feature is proposed and automatically extracted from speech. The effectiveness of several inverse filtering methods in extracting the glottal signal from speech has been examined. In addition to the glottal symmetry, two further feature classes were tested for emotion recognition: the Tones and Break Indices (ToBI) of American English intonation, and the Mel Frequency Cepstral Coefficients (MFCCs) of the glottal signal. Three corpora were specifically designed for the task. The first two investigated the four emotions Happy, Angry, Sad, and Neutral, and the third added Fear and Surprise in a six-emotion recognition task. This work shows that the glottal signal carries valuable emotional information and that using it for emotion recognition has many advantages over other conventional methods. For clean speech in the four-emotion recognition task, classical prosodic features achieved 89.67% recognition, ToBI combined with classical features reached 84.75%, while glottal symmetry alone achieved 98.74%.
For the six-emotion task these three methods achieved recognition rates of 79.62%, 90.39% and 85.37%, respectively. Using the glottal signal also provided greater classifier robustness under noisy conditions and under distortion caused by low-pass filtering. Specifically, for additive white Gaussian noise at SNR = 10 dB in the six-emotion task, the classical features and the classical features combined with ToBI both failed to provide successful results; speech MFCCs achieved a recognition rate of 41.43% and glottal symmetry reached 59.29%. This work has shown that the glottal signal, and the glottal symmetry in particular, provides high class separation for both the four- and six-emotion cases, surpassing the performance of all other features included in this investigation in noisy speech conditions and in most clean-signal conditions.
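The abstract does not spell out how the proposed glottal symmetry feature is defined. Purely as an illustrative sketch, assuming symmetry is measured as the ratio of the opening phase (cycle start to peak flow) to the closing phase (peak to cycle end) of a single glottal pulse, it might be computed as:

```python
import numpy as np

def glottal_symmetry(cycle):
    """Illustrative symmetry measure for one glottal flow cycle:
    the ratio of the opening-phase duration (start to peak) to the
    closing-phase duration (peak to end). A value of 1.0 means a
    perfectly symmetric pulse. This is an assumed stand-in, not the
    dissertation's actual feature definition."""
    cycle = np.asarray(cycle, dtype=float)
    peak = int(np.argmax(cycle))        # sample index of maximum flow
    opening = peak                      # samples before the peak
    closing = len(cycle) - 1 - peak     # samples after the peak
    return opening / max(closing, 1)

# A symmetric triangular pulse yields a ratio of 1.0; a right-skewed
# pulse (fast opening, slow closing) yields a small ratio.
symmetric = np.concatenate([np.linspace(0, 1, 5), np.linspace(1, 0, 5)[1:]])
skewed = np.array([0.0, 1.0, 0.8, 0.6, 0.4, 0.2, 0.0])
```

In the study itself the glottal signal is first recovered from speech by inverse filtering and the pulses are then segmented per cycle; the synthetic pulses above stand in for that step.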
4

Recognizing emotions from facial expressions : a computer-assisted video intervention for young children with Asperger syndrome

Garrison, Daniel Alexander 25 July 2011 (has links)
The effective encoding and interpretation of facial expressions is critical to inferring the intentions, motivation, and emotional state of others. Asperger syndrome (AS) is a pervasive neurodevelopmental condition characterized by significant deficits in social interaction, impaired use of language, and stereotyped interests and activities. Deficient encoding and interpretation of facial expressions is likely related to the social difficulties experienced by those with AS. A video-based intervention administered via the Internet is proposed for young children with AS. This research aims to clarify (1) whether young children with AS are able to interpret simple emotions and (2) whether they can learn the skills necessary to interpret complex emotions. Data will be analyzed using multivariate analysis of covariance.
5

The Role of Body Mass Index and its Covariates in Emotion Recognition

Miller, Angela Nicole Roberts 10 July 2013 (has links)
No description available.
6

Emotion Recognition of Dynamic Faces in Children with Autism Spectrum Disorder

Ostmeyer-Kountzman, Katrina 08 June 2012 (has links)
Studies examining impaired emotion recognition and perceptual processing in autism spectrum disorders (ASD) show inconsistent results (Harms, Martin, & Wallace, 2010; Jemel, Mottron, & Dawson, 2006), and many of these studies include eye tracking data. The current study utilizes a novel task, emotion recognition of a dynamic talking face with sound, to compare children with ASD (n=8; aged 6-10, 7 male) with mental age (MA) and gender matched controls (n=8; aged 4-10, 7 male) on an emotion identification and eye tracking task. Children were asked to watch several short video clips (2.5-5 seconds) portraying the emotions happy, sad, excited, scared, and angry, and to identify the emotion portrayed in each video. A mixed factorial ANOVA was conducted to examine group differences in attention when viewing the stimuli. Differences in emotion identification ability were examined using a t-test and Fisher's exact tests of independence. Findings indicated that children with ASD spent less time looking at faces and the mouth region than controls. Additionally, the amount of time children with ASD spent looking at the mouth region predicted better performance on the emotion identification task. The study was underpowered, however, so these results are preliminary and require replication. Results are discussed in relation to natural processing of emotion and social stimuli.
7

An investigation of cultural variations in emotion experience, regulation and expression in two Scottish settings

Donnan, Gemma Louise Jean January 2017 (has links)
Individuals from Aberdeen/Aberdeenshire and Glasgow/Greater Glasgow have anecdotally been thought to differ in their expression of emotion, with the former group thought to be less emotionally expressive than the latter. The current thesis carried out three studies to examine this empirically. A systematic review of measures of emotion experience, regulation, expression and alexithymia was carried out to establish their psychometric properties; its results led to recommendations for which scales to use in the subsequent studies of the thesis. The second study used the measures of emotion experience (Positive and Negative Affect Schedule; PANAS), emotion regulation (Emotion Regulation Questionnaire; ERQ) and alexithymia (Toronto Alexithymia Scale-20; TAS-20) identified in the review, in samples of adults from Aberdeen/Aberdeenshire and Glasgow/Greater Glasgow. A multiple indicators multiple causes model was used to examine group differences in response to these measures; this method allowed examination of differences in factor means and in individual indicator items on the scales. It was found that Aberdeen/Aberdeenshire participants demonstrated a higher factor mean on the Negative Affect (NA) factor of the PANAS; the Aberdeen/Aberdeenshire participants also endorsed an individual item on the ERQ (Item 5) and the TAS-20 (Item 1) more than the Glasgow/Greater Glasgow participants. Finally, a qualitative study was carried out in which participants from each group recalled events related to six emotions. In describing events related to fear, anger and sadness, Aberdeen/Aberdeenshire participants tended to use positive statements that downplayed these events, while the Glasgow/Greater Glasgow participants tended to use 'catastrophic' statements when describing events related to the same emotions. This may indicate differing cultural models between these populations.
8

Comparable, but atypical, emotion processing in high-functioning children with autism spectrum disorders : evidence from facial emotion recognition and facial emotion imitation

Farkas, Tibor Nandor January 2017 (has links)
The present thesis aimed to examine whether children with autism spectrum disorders (ASD) process emotions comparably to typically developing (TD) children or whether they show emotion processing difficulties, with particular focus on the recognition and imitation of facial emotional expressions and on the processing of human faces. Furthermore, the thesis sought to contrast the performance of children (both with and without ASD) with that of neurotypical adult participants, to establish the typical level of emotion processing and to investigate whether emotion processing capabilities improve with age from childhood to adulthood. Experiment 1 tested the recognition of the six basic emotions (anger, disgust, fear, happiness, sadness and surprise, plus neutrality) under timed conditions, introducing restricted stimulus presentation lengths (1200ms, 200ms, no limit) and increased pressure to respond (1200ms limit, no limit). In addition, the experiment compared participants' performance on human facial expressions and on the expressions of animated characters. The Animated Characters Stimulus Set was developed and validated before the main experiment. The overall performance of children with ASD was comparable to that of TD children, whose superiority only emerged with the introduction of additional task demands, through limiting the length of stimulus presentation or applying a temporal restriction on the response window. Using animated characters to present emotions, instead of human actors, however, improved emotion recognition and alleviated the difficulty of additional task demands, especially for children with ASD, when facial expressions were only briefly presented. Experiment 2 tested the effects of face inversion and in-plane rotations (from 0° to 330°, in 30° increments) on the recognition of the six basic emotions (and neutrality). Children with ASD and TD children recognised emotions with comparable accuracy, while neurotypical adults outperformed the two child groups.
Overall, emotion recognition decreased gradually as rotations approached full inversion, although this pattern was most prominent in typical adults, whereas the emotion recognition of TD children, and especially of children with ASD, varied considerably across rotations. In contrast to adults and TD children, inversion effects were only found in children with ASD when they observed negative or more complex emotions, thereby showing evidence both for the availability of configural face processing and for the use of feature-based strategies. Experiment 3 tested imitative behaviour by comparing performance on emotional facial expressions (reflecting anger, disgust, fear, happiness, sadness and surprise, plus neutrality), non-emotional facial gestures, and bilateral bodily actions/movements, presented in short video clips. The style of the imitation was also examined (subtle vs. strong stimulus intensity). A video stimulus set was developed and validated for the purpose of the experiment through a series of pilot studies. Results showed that the imitations of children with ASD were less intense than those of TD children and typical adults only when the participants were copying emotional facial expressions, but not when they reproduced non-emotional facial and bodily actions. Moreover, children with ASD were less able than the two neurotypical groups to copy the style of the presented actions (only for the imitation of emotional facial expressions). Overall, the present thesis demonstrated that the emotion processing of children with ASD was consistently comparable to TD children's when their performance was contrasted in experimental facial emotion recognition and face processing tasks, and in a behavioural study which assessed their imitations of emotional facial expressions. On the other hand, it was also shown that the emotion processing of children with ASD involved atypical features both when they were recognising and when they were reproducing emotions.
Compared to TD children, they showed increased sensitivity to the negative effects of additional task demands, appeared to rely more heavily on featural face processing strategies, and were less able to imitate the exact style of the presented emotional facial expressions. These findings support a number of theoretical approaches; the notion of an early deficit in social motivation, however, seems both appealing and promising for studying and developing socio-emotional functioning in ASD, as its perspective could help explain, and possibly influence, multiple underlying features.
9

Dysphoria and facial emotion recognition: Examining the role of rumination

Duong, David January 2012 (has links)
Rumination has been shown to be an influential part of the depressive experience, impacting on various cognitive processes including memory and attention. However, there is a dearth of studies examining the relationship between rumination and emotion recognition, deficits or biases in which have been closely linked to a depressive mood state. In Study 1, participants (N = 89) received either a rumination or distraction induction prior to completing three variants of an emotion recognition task assessing decoding accuracy or biases. Results demonstrated that greater levels of dysphoria were associated with poorer facial emotion recognition accuracy, but only when participants were induced to ruminate (as opposed to being induced to distract). The aim of Study 2 (N = 172) was to examine a possible mechanism, namely cognitive load, by which rumination affects emotion recognition. Results from this study indicated that participants endorsing greater levels of dysphoria were less accurate on an emotion recognition task when they received either a rumination induction or a cognitive load task compared to their counterparts who received a distraction induction. Importantly, the performance of those in the cognitive load and rumination conditions did not differ from each other. In summary, these findings suggest that the confluence of dysphoria and rumination can influence individuals’ accuracy in identifying emotional content portrayed in facial expressions. Furthermore, rumination, by definition an effortful process, might negatively impact emotion recognition via the strain it places on cognitive resources.
10

The Intersection of Working Memory and Emotion Recognition in Autism Spectrum Disorders

Anderson, Sharlet 18 December 2013 (has links)
The present study investigates the intersection of working memory and emotion recognition in young adults with autism spectrum disorders (ASD) and neurotypical controls. The executive functioning theory of autism grounds key impairments within the cognitive realm, whereas social-cognitive theories view social functioning impairments as primary. The executive functioning theory of ASD has been criticized because executive functioning is too broad and is composed of separable component skills. In the current study, the focus is narrowed to one of those components, working memory. It has been suggested that executive functioning may play a role in effective social interactions, and emotion recognition is an important aspect of social reciprocity, which is impaired in ASD. The current study investigates this hypothesis by combining working memory and emotion recognition into a single task, the n-back, as a model of social interaction, and by comparing performance between adults with ASD and controls. A validated set of facial expression stimuli (NimStim) was modified to remove all extraneous detail, and type of emotion was tightly controlled across 1-, 2-, and 3-back conditions. Results include significantly lower accuracy in each of the working memory load conditions in the ASD group compared to the control group, as well as in a baseline maintenance memory task. The control group's reaction time increased as working memory load increased, whereas the ASD group's reaction time did not significantly vary by n-back level. The pattern of results suggests that the limit for n-back with emotional expressions is 2-back, due to near-chance performance in both groups at 3-back, as well as definitive problems in short-term memory for facial expressions of emotion in high-functioning individuals with ASD, in contrast to previous findings of near-perfect short-term memory for facial expressions of emotion in controls.
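As context for the task design described above (not code from the study itself), an n-back run is scored by checking each trial against the stimulus presented n positions earlier. A minimal sketch, with hypothetical emotion-label stimuli and response encodings:

```python
def nback_targets(stimuli, n):
    """Indices whose stimulus matches the one presented n trials earlier."""
    return [i for i in range(n, len(stimuli)) if stimuli[i] == stimuli[i - n]]

def nback_score(stimuli, match_presses, n):
    """Score one run: `match_presses` holds the trial indices where the
    participant reported a match. Returns (hits, false_alarms, accuracy),
    with accuracy computed over the trials that have a valid n-back
    comparison (i.e. all trials after the first n)."""
    targets = set(nback_targets(stimuli, n))
    presses = set(match_presses)
    hits = len(targets & presses)
    false_alarms = len(presses - targets)
    scorable = len(stimuli) - n
    correct_rejections = scorable - len(targets) - false_alarms
    return hits, false_alarms, (hits + correct_rejections) / scorable

# 2-back over emotion labels: trials 2 and 3 are targets in this run.
run = ["happy", "sad", "happy", "sad", "angry"]
```

In the study the stimuli are NimStim face images rather than labels, but the scoring logic is the same at each n-back level.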
