  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
11

Comparable, but atypical, emotion processing in high-functioning children with autism spectrum disorders: evidence from facial emotion recognition and facial emotion imitation

Farkas, Tibor Nandor January 2017 (has links)
The present thesis aimed to examine whether children with ASD process emotions comparably to TD children or show emotion processing difficulties, with particular focus on the recognition and imitation of facial emotional expressions and on the processing of human faces. Furthermore, the thesis sought to contrast the performance of children (both with and without ASD) with that of neurotypical adult participants, to establish the typical level of emotion processing and to investigate whether emotion processing capabilities improve with age from childhood to adulthood. Experiment 1 tested the recognition of the six basic emotions (anger, disgust, fear, happiness, sadness and surprise, plus neutrality) under timed conditions, in which restricted stimulus presentation lengths (1200 ms, 200 ms, no limit) and increased pressure to respond (1200 ms response limit, no limit) were introduced. In addition, the experiment compared participants’ performance on human facial expressions and on the expressions of animated characters. The Animated Characters Stimulus Set was developed and validated before the main experiment. The overall performance of children with ASD was comparable to that of TD children, whose superiority emerged only with the introduction of additional task demands, through limiting the length of stimulus presentation or applying a temporal restriction on the response window. Using animated characters instead of human actors to present emotions, however, improved emotion recognition and alleviated the difficulty of the additional task demands, especially for children with ASD, when facial expressions were only briefly presented. Experiment 2 tested the effects of face inversion and in-plane rotations (from 0° to 330°, in 30° increments) on the recognition of the six basic emotions (and neutrality). Children with ASD and TD children recognised emotions with comparable accuracy, while neurotypical adults outperformed both child groups.
Overall, emotion recognition decreased gradually as rotations approached full inversion, although this pattern was most prominent in typical adults; the emotion recognition of TD children, and especially of children with ASD, varied considerably across rotations. In contrast to adults and TD children, inversion effects were found in children with ASD only when they observed negative or more complex emotions, thereby showing evidence both for the availability of configural face processing and for the use of feature-based strategies. Experiment 3 tested imitative behaviour by comparing performance on emotional facial expressions (reflecting anger, disgust, fear, happiness, sadness and surprise, plus neutrality) and on non-emotional facial gestures and bilateral bodily actions/movements, presented in short video clips. The style of the imitation was also examined (subtle vs. strong stimulus intensity). A video stimulus set was developed and validated for the purpose of the experiment in a series of pilot studies. Results showed that the imitations of children with ASD were less intense than those of TD children and typical adults only when the participants were copying emotional facial expressions, but not when they reproduced non-emotional facial and bodily actions. Moreover, children with ASD were less able than the two neurotypical groups to copy the style of the presented actions (only for the imitation of emotional facial expressions). Overall, the present thesis demonstrated that the emotion processing of children with ASD was consistently comparable to that of TD children when their performance was contrasted in experimental facial emotion recognition and face processing tasks, and in a behavioural study that assessed their imitations of emotional facial expressions. On the other hand, it was also shown that the emotion processing of children with ASD involved atypical features both when they were recognising and when they were reproducing emotions.
Compared to TD children, they showed increased sensitivity to the negative effects of additional task difficulties, their advantage in utilising featural face-processing strategies seemed greater, and they were less able to imitate the exact style of the presented emotional facial expressions. These findings support a number of theoretical approaches; the notion of an early deficit in social motivation, however, seems both appealing and promising for studying and developing socio-emotional functioning in ASD, as its perspective could help researchers reflect on, and possibly affect, multiple underlying features.
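Group-by-condition comparisons like those in Experiments 1 and 2 reduce to aggregating trial-level hits into per-cell accuracies. A minimal bookkeeping sketch in Python; the record fields `group`, `limit_ms`, and `correct` are illustrative names, not the thesis's actual data format:

```python
from collections import defaultdict

def accuracy_by_condition(trials):
    """Aggregate recognition accuracy per (group, presentation-limit) cell.

    `trials` is a list of dicts with hypothetical keys: 'group'
    (e.g. 'ASD', 'TD', 'adult'), 'limit_ms' (1200, 200, or None for
    unlimited presentation), and 'correct' (bool).
    """
    hits = defaultdict(int)
    totals = defaultdict(int)
    for t in trials:
        key = (t['group'], t['limit_ms'])
        totals[key] += 1
        hits[key] += int(t['correct'])
    return {k: hits[k] / totals[k] for k in totals}
```

Cells can then be compared across groups within each presentation-time condition, mirroring the contrasts reported above.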
12

Human Emotion Recognition from Body Language of the Head using Soft Computing Techniques

Zhao, Yisu 31 October 2012 (has links)
When people interact with each other, they not only listen to what the other says; they react to facial expressions, gaze direction, and head movement. Human-computer interaction would be enhanced in a friendly and non-intrusive way if computers could understand and respond to users’ body language in the same way. This thesis aims to investigate new methods for human-computer interaction that combine information from the body language of the head to recognize emotional and cognitive states. We concentrated on the integration of facial expression, eye gaze and head movement using soft computing techniques. The whole procedure is done in two stages. The first stage focuses on the extraction of explicit information from the modalities of facial expression, head movement, and eye gaze. In the second stage, all this information is fused by soft computing techniques to infer the implicit emotional states. In this thesis, the frequency of head movement (high-frequency or low-frequency movement) is taken into consideration as well as head nods and head shakes. Very high-frequency head movement may indicate much more arousal and activity than low-frequency movement, placing it in a different region of the dimensional emotion space. The head movement frequency is acquired by analyzing the tracked coordinates of the detected nostril points. Eye gaze also plays an important role in emotion detection. An eye gaze detector was proposed to analyze whether the subject's gaze direction was direct or averted. We proposed a geometrical relationship between the nostrils and the two pupils to achieve this task. Four parameters, defined from the changes in angles and in the length proportions among the four feature points, distinguish averted gaze from direct gaze. The sum of these parameters serves as an evaluation score that quantifies gaze level.
The multimodal fusion is done by hybridizing decision-level fusion with soft computing techniques for classification. This avoids the disadvantages of the decision-level fusion technique while retaining its advantages of adaptation and flexibility. We introduced fuzzification strategies that quantify the extracted parameters of each modality into a fuzzified value between 0 and 1. These fuzzified values are the inputs to the fuzzy inference systems, which map them onto emotional states.
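As one way to picture the fusion step, here is a minimal Mamdani-style sketch assuming two fuzzified inputs: head-movement frequency and a gaze-level score in [0, 1]. The membership breakpoints, rule set, and state labels are illustrative assumptions, not the thesis's actual parameters:

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), 1 at the peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def infer_state(head_freq_hz, gaze_level):
    """Fuzzify two modality parameters and apply Mamdani-style rules
    (AND = min, OR = max) to obtain degrees for two emotional states."""
    high_freq = tri(head_freq_hz, 1.0, 3.0, 5.0)   # 'high-frequency movement'
    low_freq = tri(head_freq_hz, -1.0, 0.0, 1.5)   # 'low-frequency movement'
    direct = max(0.0, min(1.0, gaze_level))        # clamp gaze score to [0, 1]
    return {
        'active':  min(high_freq, direct),         # high frequency AND direct gaze
        'passive': max(low_freq, 1.0 - direct),    # low frequency OR averted gaze
    }
```

A full system would add membership functions for the facial-expression parameters and defuzzify the rule outputs into a final label; this fragment only shows the shape of the fuzzification-and-rules pipeline.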
13

Ecological Factors in Emotion Recognition using Physiological Signals

Hung, Delbert 08 December 2011 (has links)
To address the feasibility of ambulatory emotion recognition, characteristics of biosignals were compared between sitting and controlled walking using different stimulus modalities. Emotional stimulus items were drawn from the International Affective Pictures System and International Affective Digitized Sounds libraries to elicit five basic emotions. To assess which emotion was elicited, participants (n=15) completed self-report scales using the Self-Assessment Manikin and discrete emotion ratings following the presentation of each stimulus item. Autonomic activity was monitored using electrocardiogram, electrodermal activity, and thoracic and abdominal respiration. Multivariate analysis of variance was employed to test for differences in biosignal features and supervised classifiers were trained to predict the elicited emotion using physiological data. The study revealed differences between sitting and walking states but no effect was found for stimulus modality. Self-reported emotions were poorly predicted using our methodology and a discussion of potential directions and recommendations for future research was presented.
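The classification step described above (supervised classifiers trained on physiological features) can be illustrated with a deliberately simple nearest-centroid learner. The feature vectors below are hypothetical stand-ins for quantities such as heart rate or electrodermal level, not the study's actual features:

```python
from collections import defaultdict

def fit_centroids(features, labels):
    """Average the feature vectors of each class into one centroid."""
    groups = defaultdict(list)
    for x, y in zip(features, labels):
        groups[y].append(x)
    return {c: [sum(col) / len(vecs) for col in zip(*vecs)]
            for c, vecs in groups.items()}

def predict(centroids, x):
    """Return the label of the centroid nearest to x (squared Euclidean)."""
    return min(centroids,
               key=lambda c: sum((a - b) ** 2 for a, b in zip(x, centroids[c])))
```

In practice a study like this would use stronger classifiers and cross-validation, but the fit/predict structure over extracted biosignal features is the same.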
15

Dysphoria and facial emotion recognition: Examining the role of rumination

Duong, David January 2012 (has links)
Rumination has been shown to be an influential part of the depressive experience, impacting on various cognitive processes including memory and attention. However, there is a dearth of studies examining the relationship between rumination and emotion recognition, deficits or biases in which have been closely linked to a depressive mood state. In Study 1, participants (N = 89) received either a rumination or distraction induction prior to completing three variants of an emotion recognition task assessing decoding accuracy or biases. Results demonstrated that greater levels of dysphoria were associated with poorer facial emotion recognition accuracy, but only when participants were induced to ruminate (as opposed to being induced to distract). The aim of Study 2 (N = 172) was to examine a possible mechanism, namely cognitive load, by which rumination affects emotion recognition. Results from this study indicated that participants endorsing greater levels of dysphoria were less accurate on an emotion recognition task when they received either a rumination induction or a cognitive load task compared to their counterparts who received a distraction induction. Importantly, the performance of those in the cognitive load and rumination conditions did not differ from each other. In summary, these findings suggest that the confluence of dysphoria and rumination can influence individuals’ accuracy in identifying emotional content portrayed in facial expressions. Furthermore, rumination, by definition an effortful process, might negatively impact emotion recognition via the strain it places on cognitive resources.
16

The Intersection of Working Memory and Emotion Recognition in Autism Spectrum Disorders

Anderson, Sharlet 18 December 2013 (has links)
The present study investigates the intersection of working memory and emotion recognition in young adults with autism spectrum disorders (ASD) and neurotypical controls. The executive functioning theory of autism grounds key impairments within the cognitive realm, whereas social-cognitive theories view social functioning impairments as primary. Executive functioning theory of ASD has been criticized because executive functioning is too broad and is composed of separable, component skills. In the current study, the focus is narrowed to one of those components, working memory. It has been suggested that executive functioning may play a role in effective social interactions. Emotion recognition is an important aspect of social reciprocity, which is impaired in ASD. The current study investigates this hypothesis by combining working memory and emotion recognition into a single task, the n-back, as a model of social interaction and comparing performance between adults with ASD and controls. A validated set of facial expression stimuli (NimStim) was modified to remove all extraneous detail, and type of emotion was tightly controlled across 1-, 2-, and 3-back conditions. Results include significantly lower accuracy in each of the working memory load conditions in the ASD group compared to the control group, as well as in a baseline, maintenance memory task. The control group's reaction time increased as working memory load increased, whereas the ASD group's reaction time did not significantly vary by n-back level. The pattern of results suggests that the limit for n-back with emotional expressions is 2-back, due to near chance level performance in both groups for 3-back, as well as definitive problems in short term memory for facial expressions of emotion in high-functioning individuals with ASD, in contrast to previous findings of near perfect short term memory for facial expressions of emotion in controls.
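The n-back scoring itself is straightforward to state in code. A sketch of the target definition and accuracy computation (the stimulus labels and boolean response encoding are assumptions, not the study's materials):

```python
def nback_targets(stimuli, n):
    """Indices where the stimulus matches the one n positions back."""
    return [i for i in range(n, len(stimuli)) if stimuli[i] == stimuli[i - n]]

def nback_accuracy(stimuli, responses, n):
    """Proportion of scorable positions answered correctly.

    `responses[i]` is True if the participant signalled 'match' at
    position i. Positions before index n are excluded, since no n-back
    comparison exists for them.
    """
    correct = 0
    scored = 0
    for i in range(n, len(stimuli)):
        is_target = stimuli[i] == stimuli[i - n]
        scored += 1
        correct += int(responses[i] == is_target)
    return correct / scored if scored else 0.0
```

Raising `n` increases the number of intervening faces that must be held in working memory, which is what makes the 2-back vs. 3-back contrast above informative.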
17

A Comparison of the Recognition of Facial Emotion in Women of Low Body Weight, Both With and Without Anorexia Nervosa

Muir, Karin January 2011 (has links)
Facial expressions can be reliable markers of emotion, and represent an important source of social information. Consequently, the ability to judge facial expressions accurately is essential for successful interpersonal interactions. Anorexia nervosa (AN) is an eating disorder in which social difficulties are common. Past research has suggested that facial emotion recognition may be disturbed in AN, although the precise nature of this disturbance is unclear. The current study aimed to further investigate emotion recognition in AN by comparing 12 women with AN to 21 women who were constitutionally thin (CT) on the Facial Expression Recognition Test, an established computerized test of facial emotion recognition. Still photographs of faces displaying different emotional expressions and neutral expressions were presented to participants via computer screen. Participants were required to decide which emotion each face displayed from several choices. AN subjects responded faster than CT subjects to the facial emotion stimuli, regardless of which emotion was displayed. However, AN subjects did not differ from CT subjects on overall accuracy, accuracy for different emotion categories or misclassifications. Results are discussed in terms of the cognitive style of individuals with AN, recent models of socio-emotional processing, and issues of methodology.
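Per-category accuracy and the pattern of misclassifications, both analysed in the study above, come from the same confusion counts. A minimal sketch (the emotion labels are illustrative):

```python
from collections import Counter

def confusion_counts(displayed, chosen):
    """Count (displayed emotion, chosen emotion) pairs; the off-diagonal
    cells are the misclassifications."""
    return Counter(zip(displayed, chosen))

def per_emotion_accuracy(displayed, chosen):
    """Proportion correct for each displayed emotion category."""
    totals, hits = Counter(displayed), Counter()
    for d, c in zip(displayed, chosen):
        if d == c:
            hits[d] += 1
    return {e: hits[e] / totals[e] for e in totals}
```

Comparing these tables between groups, alongside response times, is what supports the conclusion that the AN group was faster but no less accurate.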
19

Genetic and Parental Influences on the Development of Emotion Recognition Skills in Children

John, Sufna Gheyara 01 August 2014 (has links)
Abstract of the dissertation of Sufna Gheyara John, for the Doctor of Philosophy degree in Psychology, presented on March 21st, 2014, at Southern Illinois University Carbondale. Title: Genetic and Parental Influences on the Development of Emotion Recognition Skills in Children. Major professor: Dr. Lisabeth DiLalla. The purpose of this study was to examine the magnitude of genetic and environmental influences on children's emotion recognition (ER) skills and social difficulties (bullying and victimization). An additional goal was to examine the relation between parent ER skills, child ER skills, and child social difficulties. It was expected that genetic and environmental influences would account for a significant portion of the variance in child ER skills and social difficulties, and that child ER skills and social difficulties would share common genetic and environmental influences. Moreover, it was predicted that parent and child ER skills would significantly predict child social difficulties. Finally, it was predicted that angry and fearful biases in children's ER abilities would lead to greater social difficulties. 121 children (forming 69 twin pairs) ages 6-10 years and their parents participated in the study. Children and their parents completed an objective measure of ER abilities and subjective measures of child social difficulties. Separate analyses were conducted for child social difficulties by informant (parent or child) and type of difficulty (bullying or victimization). Results from this study suggest that genetic and non-shared environmental influences account for a significant portion of the variance in child ER skills, parent-reported bullying and victimization, and child-reported bullying. Conversely, environmental influences account for a significant portion of the variance in child-reported victimization. Child ER abilities and child-reported bullying shared common genetic influences.
Path modeling demonstrated that parent ER skills predicted child ER skills and parent-reported bullying, whereas child ER skills predicted child-reported victimization. Finally, children who demonstrated an angry or fearful bias had greater involvement in bullying and were more victimized. These results underscore the importance of conceptualizing bullying and victimization from a biopsychosocial perspective that incorporates both biological and environmental influences on complex social behavior. Moreover, results in this study varied by informant, suggesting that it is important to gather information from multiple perspectives in order to gain the most comprehensive view of child behavior. Finally, these results suggest that helping children to achieve a more balanced perspective in their emotion recognition abilities may help reduce their involvement in socially maladaptive interactions.
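Twin designs of this kind often summarise the variance components with Falconer's classical formulas, computed from the monozygotic and dizygotic twin correlations. A sketch (the input correlations below are made-up values, not this study's estimates):

```python
def falconer_estimates(r_mz, r_dz):
    """Classical Falconer decomposition of trait variance from
    monozygotic (r_mz) and dizygotic (r_dz) twin correlations."""
    h2 = 2 * (r_mz - r_dz)   # additive genetic variance (heritability)
    c2 = 2 * r_dz - r_mz     # shared environmental variance
    e2 = 1 - r_mz            # non-shared environment (plus measurement error)
    return {'h2': h2, 'c2': c2, 'e2': e2}
```

Modern studies typically fit structural equation (ACE) models rather than applying these formulas directly, but the formulas convey how MZ/DZ comparisons separate genetic from shared- and non-shared-environmental influences.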
20

Development of a Multisensorial System for Emotion Recognition

FLOR, H. R. 17 March 2017 (has links)
Automated reading and analysis of human emotion has the potential to be a powerful tool for developing a wide variety of applications, such as human-computer interaction systems; at the same time, it is a very difficult problem because human communication is very complex. Humans employ multiple sensory systems in emotion recognition. In the same way, an emotionally intelligent machine requires multiple sensors to be able to create an affective interaction with users. Thus, this Master's thesis proposes the development of a multisensorial system for automatic emotion recognition. The multisensorial system is composed of three sensors, which allowed different emotional aspects to be explored: eye tracking, using the IR-PCR technique, supported studies of visual social attention; the Kinect, in conjunction with the FACS-AU system technique, allowed the development of a tool for facial expression recognition; and the thermal camera, using the FT-RoI technique, was employed to detect facial thermal variation. The multisensorial integration of the system made possible a more complete and varied analysis of the emotional aspects, allowing the evaluation of focal attention, valence comprehension, valence expression, facial expression, valence recognition and arousal recognition.
Experiments were performed with sixteen healthy adult volunteers and 105 healthy child volunteers. The resulting system was able to detect eye gaze, recognize facial expressions and estimate valence and arousal for emotion recognition. It also has the potential to analyze people's emotions from facial features using contactless sensors in semi-structured environments, such as clinics, laboratories, or classrooms, and to become an embedded tool in robots, endowing these machines with emotional intelligence for a more natural interaction with humans. Keywords: emotion recognition, eye tracking, facial expression, facial thermal variation, multisensorial integration
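The final valence/arousal estimate can be read against the circumplex model of affect. A minimal sketch of that quadrant lookup, with made-up coarse labels (the thesis's own output categories may differ):

```python
def quadrant_emotion(valence, arousal):
    """Map a (valence, arousal) estimate, each in [-1, 1], to a coarse
    quadrant label from the circumplex model of affect."""
    if arousal >= 0:
        return 'excited/happy' if valence >= 0 else 'angry/afraid'
    return 'calm/content' if valence >= 0 else 'sad/bored'
```

Real systems refine this with finer angular sectors and intensity bands, but the two-axis reading is the core idea behind combining valence recognition with arousal recognition.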
