31

FaceMaze: An Embodied Cognition Approach To Facial Expression Production in Autism Spectrum Disorder

Gordon, Iris 25 August 2014 (has links)
Individuals with Autism Spectrum Disorder (ASD) are typified by deficits in social communication, including flat and disorganized affect. Previous research investigating affect production in ASD has demonstrated that individuals on the spectrum show impairments in posing, but not mimicking, facial expressions. These findings point to a deficit in ASD individuals’ integration of sensory/motor facets in the cognitive representation of a facial expression, rather than a deficit in motor or sensory ability. The goal of the current project was to validate a computer-based intervention that targets facial expression production using methods grounded in embodied cognition to connect the sensory and motor facets of facial displays. The “FaceMaze” is a Pac-Man-like game in which players navigate through a maze of obstacles and must produce high-quality facial expressions in order to overcome them. FaceMaze relies on the Computer Expression Recognition Toolbox (CERT), which analyzes users’ facial expressions in real time and provides feedback based on the Facial Action Coding System (FACS). In the first part of this project, FaceMaze was validated with a typically developing (TD) adult population. In Experiment 1, participants were prompted to produce expressions of “Happy,” “Angry,” and “Surprise” before and after playing FaceMaze. Electromyography (EMG) analysis targeted three expression-specific facial muscles: Zygomaticus Major (ZM, Happy), Corrugator Supercilii (CS, Angry), and Orbicularis Oculi (OO, Surprise). Relative to pre-game productions, ZM activation increased for happy expressions and CS activation increased for angry expressions after playing the corresponding version of FaceMaze. Critically, no change in muscle activity was observed for the control expression, “Surprise.” In Experiment 2, the perceived quality of facial expressions after FaceMaze/CERT training was compared with that of expressions produced after traditional FACS training. “Happy,” “Angry,” and “Surprise” expressions were videotaped before and after FaceMaze game-play and FACS training, and productions were assessed by a group of naïve raters. Whereas observers rated post-training “Happy” expressions as happier for both FaceMaze and FACS, only the post-training “Angry” expressions in the FaceMaze condition were rated as angrier and less happy. In the second half of this project, the efficacy of FaceMaze was evaluated with children with ASD and age- and IQ-matched, typically developing (TD) controls. In Experiment 3 (in press), children were asked to pose “Happy,” “Angry,” and “Surprise” expressions before and after game-play. Expressions were video-recorded and presented to naïve raters, who assessed the video clips on expression quality. The ASD group’s post-FaceMaze “Happy” and “Angry” expressions were rated higher in quality than their pre-FaceMaze productions. TD children also showed higher quality ratings for the “Angry” expression post-gameplay, but no enhancement of the “Happy” expression was found after FaceMaze. Moreover, the ASD group’s post-FaceMaze expressions were rated as equal in quality to those of the TD group. These findings not only underscore the fidelity of the FaceMaze game in enhancing facial expression production, but also provide support for a theory of disordered embodied cognition in ASD. / Graduate / igordon@uvic.ca
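As an illustration of the kind of pre/post EMG comparison described in Experiment 1, a minimal sketch in Python might look like the following; the muscle label, amplitude values, and paired t-test with Cohen's d are assumptions chosen for illustration and are not taken from the thesis.

```python
# Hypothetical sketch: comparing pre- vs post-game EMG activation for one muscle.
# The amplitude values below are invented placeholders.
import numpy as np
from scipy import stats

def compare_pre_post(pre_amplitudes, post_amplitudes):
    """Paired comparison of mean rectified EMG amplitude (arbitrary units)."""
    pre = np.asarray(pre_amplitudes, dtype=float)
    post = np.asarray(post_amplitudes, dtype=float)
    t, p = stats.ttest_rel(post, pre)                    # paired t-test, post vs. pre
    d = (post - pre).mean() / (post - pre).std(ddof=1)   # Cohen's d for paired data
    return {"t": t, "p": p, "cohens_d": d}

# Example with made-up numbers for the Zygomaticus Major ("Happy" condition):
pre_zm = [0.42, 0.35, 0.51, 0.38, 0.47]
post_zm = [0.55, 0.49, 0.60, 0.44, 0.58]
print(compare_pre_post(pre_zm, post_zm))
```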
32

Changes in the Neural Bases of Emotion Regulation Associated with Clinical Improvement in Children with Anxiety Disorders

Hum, Kathryn 13 December 2012 (has links)
Background: The present study was designed to examine prefrontal cortical processes in anxious children that mediate cognitive regulation in response to emotion-eliciting stimuli, and the changes that occur after anxious children participate in a cognitive behavioral therapy treatment program. Methods: Electroencephalographic activity was recorded from clinically anxious children and typically developing children at pre- and post-treatment sessions. Event-related potential components were recorded while children performed a go/no-go task using facial stimuli depicting angry, calm, and happy expressions. Results: At pre-treatment, anxious children had significantly greater posterior P1 and frontal N2 amplitudes than typically developing children, components associated with attention/arousal and cognitive control, respectively. For the anxious group only, there were no differences in neural activation between face (emotion) types or trial (Go vs. No-go) types. Anxious children who did not improve with treatment showed increased cortical activation within the time window of the P1 at pre-treatment relative to comparison and improver children. From pre- to post-treatment, only anxious children who improved with treatment showed increased cortical activation within the time window of the N2. Conclusions: At pre-treatment, anxious children appeared to show increased cortical activation regardless of the emotional content of the stimuli. Anxious children also showed greater medial-frontal activity regardless of task demands and response accuracy. These findings suggest indiscriminate cortical processes that may underlie the hypervigilant regulatory style seen in clinically anxious individuals. Neural activation patterns following treatment suggest that heightened perceptual vigilance, as represented by increased P1 amplitudes for non-improvers, may have prevented these anxious children from learning the treatment strategies, leading to poorer outcomes. Increased cognitive control, as represented by increased N2 amplitudes for improvers, may have enabled these anxious children to implement treatment strategies more effectively, leading to improved treatment outcomes. Hence, P1 activation may serve as a predictor of treatment outcome, while N2 activation may serve as an indicator of treatment-related outcome. These findings point to the cortical processes that maintain maladaptive functioning versus the cortical processes that underlie successful intervention in clinically anxious children.
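The P1 and N2 analyses described above amount to averaging epoched EEG within component-specific time windows at the relevant electrodes. A minimal, hypothetical sketch follows; the window bounds, sampling rate, and data layout are assumptions for illustration rather than the study's exact parameters.

```python
# Illustrative sketch: extracting a mean ERP amplitude within a component window
# (e.g., P1 at a posterior electrode) from epoched data.
import numpy as np

def mean_amplitude(epochs, times, t_start, t_end):
    """epochs: (n_trials, n_samples) array in microvolts; times: (n_samples,) in seconds."""
    mask = (times >= t_start) & (times <= t_end)
    return epochs[:, mask].mean(axis=1)   # one mean amplitude per trial

# e.g., an assumed P1 window of 80-130 ms and N2 window of 200-350 ms post-stimulus
sfreq = 500.0
times = np.arange(-0.2, 0.8, 1.0 / sfreq)
epochs = np.random.randn(40, times.size)          # placeholder epoched data
p1 = mean_amplitude(epochs, times, 0.08, 0.13)
n2 = mean_amplitude(epochs, times, 0.20, 0.35)
print(p1.mean(), n2.mean())
```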
34

Sensitivity to Emotion Specified in Facial Expressions and the Impact of Aging and Alzheimer's Disease

McLellan, Tracey Lee January 2008 (has links)
This thesis describes a program of research that investigated the sensitivity of healthy young adults, healthy older adults, and individuals with Alzheimer’s disease (AD) to happiness, sadness, and fear specified in facial expressions. In particular, the research investigated the sensitivity of these individuals to the distinctions between spontaneous expressions of emotional experience (genuine expressions) and deliberate, simulated expressions of emotional experience (posed expressions). The specific focus was to examine whether aging and/or AD affects sensitivity to the target emotions. Emotion-categorization and priming tasks were completed by all participants. The tasks employed an original set of ecologically valid facial displays generated specifically for the present research. The categorization task (Experiments 1a, 2a, 3a, 4a) required participants to judge whether targets were, or were not, showing and feeling each target emotion. The results showed that all three groups identified a genuine expression as both showing and feeling the target emotion, whilst a posed expression was identified more frequently as showing than feeling the emotion. Signal detection analysis demonstrated that all three groups were sensitive to the expression of emotion, reliably differentiating expressions of experienced emotion (genuine expressions) from expressions unrelated to emotional experience (posed and neutral expressions). In addition, both healthy young and older adults could reliably differentiate between posed and genuine expressions of happiness and sadness, whereas individuals with AD could not. Sensitivity to emotion specified in facial expressions was found to be emotion specific and independent of both the level of general cognitive functioning and specific cognitive functions. The priming task (Experiments 1b, 2b, 3b, 4b) employed the facial expressions as primes in a word-valence task in order to investigate spontaneous attention to facial expression. Healthy young adults showed an emotion-congruency priming effect only for genuine expressions; healthy older adults and individuals with AD showed no priming effects. Results are discussed in terms of the understanding of the recognition of emotional states in others and the impact of aging and AD on that recognition. Consideration is given to how these findings might influence the care and management of individuals with AD.
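The signal detection analysis mentioned above can be illustrated with a short d-prime computation, treating genuine-expression trials as signal and posed/neutral trials as noise. The trial counts below are invented, and the 1/(2N) correction for extreme rates is one common convention rather than necessarily the one used in the thesis.

```python
# Rough sketch of a signal-detection sensitivity (d') calculation.
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    n_signal = hits + misses
    n_noise = false_alarms + correct_rejections
    # Avoid infinite z-scores with a standard 1/(2N) correction
    hit_rate = min(max(hits / n_signal, 1 / (2 * n_signal)), 1 - 1 / (2 * n_signal))
    fa_rate = min(max(false_alarms / n_noise, 1 / (2 * n_noise)), 1 - 1 / (2 * n_noise))
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Invented counts for one observer judging "feeling the emotion":
print(d_prime(hits=38, misses=10, false_alarms=12, correct_rejections=36))
```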
35

A Real Time Facial Expression Recognition System Using Deep Learning

Miao, Yu 27 November 2018 (has links)
This thesis presents an image-based real-time facial expression recognition system that is capable of recognizing the basic facial expressions of several subjects simultaneously from a webcam. Our proposed methodology combines a supervised transfer learning strategy and a joint supervision method with a new supervision signal that is crucial for facial tasks. A convolutional neural network (CNN) model, MobileNet, which offers both accuracy and speed, is deployed in both offline and real-time frameworks to enable fast and accurate real-time output. Evaluations of both offline and real-time experiments are provided in our work. The offline evaluation is carried out by first evaluating two publicly available datasets, JAFFE and CK+, and then presenting the results of a cross-dataset evaluation between these two datasets to verify the generalization ability of the proposed method. A comprehensive evaluation configuration for the CK+ dataset is given in this work, providing a baseline for fair comparison. The method reaches an accuracy of 95.24% on the JAFFE dataset and 96.92% on the 6-class CK+ dataset, which contains only the last frames of the image sequences. The resulting average run-time cost for recognition in the real-time implementation is approximately 3.57 ms/frame on an NVIDIA Quadro K4200 GPU. The results demonstrate that our proposed CNN-based framework for facial expression recognition, which does not require a massive preprocessing module, can not only achieve state-of-the-art accuracy on these two datasets but also perform the classification task much faster than a conventional machine learning methodology as a result of the lightweight structure of MobileNet.
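A minimal transfer-learning sketch in the spirit of this approach (a frozen MobileNet backbone with a new classification head) is shown below; the input size, class count, and training settings are assumptions, and the thesis's joint-supervision signal is omitted for brevity.

```python
# Hedged sketch only: MobileNet backbone fine-tuned for facial expression recognition.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 7  # e.g., six basic expressions plus neutral (assumption)

base = tf.keras.applications.MobileNet(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet")
base.trainable = False  # freeze the backbone for the first training stage

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.5),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# model.fit(train_ds, validation_data=val_ds, epochs=20)  # datasets prepared elsewhere
```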
36

Facial Behavior and Pair Bonds in Hylobatids

Florkiewicz, Brittany Nicole 01 May 2016 (has links)
Among primates, humans have the largest and most complex facial repertoires, followed not by their closest living hominid relatives but by hylobatids. Facial behavior is an important component of primate communication that transfers and modulates intentions and motivations. However, why such great variation in primate facial expressions evolved, and why hylobatid facial repertoires seem more similar to those of humans than to those of other apes, remains unclear. The current study compared 206 hours of video and 103 hours of focal-animal data on facial expression repertoires, measures of pair-bond strength, and behavioral synchrony in ten hylobatid pairs from three genera (Nomascus, Hoolock, and Hylobates) living at the Gibbon Conservation Center, Santa Clarita, CA. The study explored whether facial repertoire breadth or frequency of use was linked to social parameters of pair bonds, how facial expressions related to behavioral synchrony, and whether facial feedback (i.e., the transfer of behaviors and intentions by mimicking observed facial expressions) was important between pair-partners. Pair-partners’ facial repertoires correlated strongly in composition and rate of use, suggesting that facial feedback was important, whereas behavioral synchrony showed no correlation with facial behavior. The results of this study suggest that larger facial repertoires contribute to strengthening pair bonds, because richer facial repertoires provide more opportunities for facial feedback, which effectively creates a better ‘understanding’ between partners through smoother and better-coordinated interaction patterns.
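One simple way to quantify the intra-pair repertoire correspondence described above is to correlate the two partners' usage rates across expression types. The sketch below uses invented expression labels and counts purely for illustration; it is not the study's actual analysis.

```python
# Illustrative sketch: rank correlation of expression-type usage between pair-partners.
from scipy.stats import spearmanr

partner_a = {"open_mouth": 31, "lip_pucker": 12, "brow_raise": 7, "play_face": 22}
partner_b = {"open_mouth": 27, "lip_pucker": 15, "brow_raise": 5, "play_face": 19}

shared = sorted(set(partner_a) | set(partner_b))       # union of expression types
rates_a = [partner_a.get(k, 0) for k in shared]
rates_b = [partner_b.get(k, 0) for k in shared]
rho, p = spearmanr(rates_a, rates_b)
print(f"rho={rho:.2f}, p={p:.3f}")
```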
37

"Percepção de expressões faciais da emoção e lateralização cerebral". / Perception of facial expressions and brain lateralization.

Nelson Torro Alves 30 September 2004 (has links)
The role that each brain hemisphere plays in the processing of emotional information has been frequently discussed in the scientific literature. The aim of this study was to investigate the pattern of hemispheric dominance for the perception of the facial expressions of happiness, sadness, anger, and fear. In two experiments, the divided-visual-field technique was used, with tachistoscopic presentation of stimuli on a computer screen for 150 ms. The stimuli were composed of photographs of the faces of four people (2 male, 2 female) taken from the series Pictures of Facial Affect. Twenty-one right-handed observers (9 male, 12 female) took part in Experiment 1. In each trial, two photographs of faces were presented on the computer screen, one to the left and one to the right of the fixation point, in four different conditions: 1) face with emotion on the left and neutral face on the right, 2) neutral face on the left and face with emotion on the right, 3) faces with emotion on both the left and the right, 4) neutral faces on both the left and the right. In each trial the observers indicated the side on which the face appeared to express more emotion. Seventeen right-handed observers (8 male, 9 female) took part in Experiment 2. In each stimulus presentation, a photograph of a face was presented either to the right or to the left of the fixation point, located at the center of the screen, and a gray rectangle was presented on the opposite side. The following stimulus conditions were used: 1) face with emotion on the left and gray rectangle on the right, 2) gray rectangle on the left and face with emotion on the right, 3) neutral face on the left and gray rectangle on the right, 4) gray rectangle on the left and neutral face on the right. In each trial the observers judged whether or not the presented face showed emotion. Reaction times and judgment errors were submitted to repeated-measures ANOVAs. In the first experiment, emotion was generally detected more quickly in faces presented in the left visual field (p<0.01). The expressions of sadness and anger were also perceived more quickly when presented in the left visual field (p<0.05). In both experiments, expressions of happiness and fear were perceived more quickly and more accurately than expressions of sadness and anger (p<0.001). The expression of sadness was detected more easily in female faces and the expression of anger in male faces (p<0.05). In general, however, emotion was detected more easily in female faces. In both experiments there were differences in perception between the faces of the four individuals who posed the facial expressions. The right hemisphere was superior to the left in the perception of facial expressions, especially the expressions of sadness and anger. The perceptive advantage of the right hemisphere is more evident for expressions that are detected with greater difficulty. The perception of facial expressions can be affected by the gender of the face and by the singularities of the individual facial expression.
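The repeated-measures ANOVAs on reaction times described above could be run along the following lines; the long-format column names and the simulated data are assumptions for illustration only, not the study's dataset.

```python
# Illustrative sketch: repeated-measures ANOVA with visual field and emotion
# as within-subject factors, on simulated reaction times.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
rows = []
for subj in range(1, 13):
    for field in ("left", "right"):
        for emotion in ("happiness", "sadness", "anger", "fear"):
            rt = 550 + (20 if field == "right" else 0) + rng.normal(0, 30)
            rows.append({"subject": subj, "field": field, "emotion": emotion, "rt": rt})
df = pd.DataFrame(rows)

res = AnovaRM(df, depvar="rt", subject="subject", within=["field", "emotion"]).fit()
print(res)
```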
38

Recognition of Facial Expressions of Six Emotions by Children with Specific Language Impairment

Atwood, Kristen Diane 21 July 2006 (has links) (PDF)
Over the past several years, research has shown that children with language impairment often have increased social difficulties. The purpose of this study was to take a closer look at the relationship between language ability and emotion understanding by examining the recognition of facial expressions in children with specific language impairment (SLI) and their typically developing peers. As such, this study is a follow-up investigation of the work done by Spackman, Fujiki, Brinton, Nelson, & Allen (2006). Children with SLI and their age- and gender-matched peers were asked to identify the following six facial expressions of emotion in a language-minimal manner: happiness, anger, fear, surprise, sadness, and disgust. Group performance was then compared for each of the emotions examined. This study found significant differences between the groups (SLI vs. typical), with the children without language impairment performing better than those with SLI. There was also a significant effect of emotion, indicating that some emotions were identified more accurately than others. No significant effects were found for gender, nor were any interaction effects between variables found.
39

Secrets of a smile? Your gender and perhaps your biometric identity

Ugail, Hassan 11 June 2018 (has links)
With its numerous applications, automatic facial emotion recognition has recently become a very active area of research. Yet there has been little detailed study of the dynamic components of facial expressions. This article reviews research showing that gender is encoded in the dynamics of a smile, and considers how it may be possible to use the dynamic components of facial expressions as a form of biometric.
40

The computational face for facial emotion analysis: Computer based emotion analysis from the face

Al-dahoud, Ahmad January 2018 (has links)
Facial expressions are considered to be the most revealing way of understanding the human psychological state during face-to-face communication. It is believed that a more natural interaction between humans and machines can be achieved through a detailed understanding of the different facial expressions which imitate the manner in which humans communicate with each other. In this research, we study different aspects of facial emotion detection and analysis, and investigate possible hidden identity clues within facial expressions. We examine a deeper aspect of facial expressions whereby we try to identify gender and human identity, which can be considered a form of emotional biometric, using only the dynamic characteristics of the smile expression. Further, we present a statistical model for analysing the relationship between facial features and Duchenne (real) and non-Duchenne (posed) smiles, and identify that the expressions around the eyes contain features that discriminate between Duchenne and non-Duchenne smiles. Our results indicate that facial expressions can be identified through facial movement analysis models, with an accuracy of 86% for classifying the six universal facial expressions and 94% for classifying 18 common facial action units. Further, we successfully identify gender using only the dynamic characteristics of the smile expression, obtaining an 86% classification rate. Likewise, we present a framework for studying the possibility of using the smile as a biometric, and show that the human smile is unique and stable. / Al-Zaytoonah University
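The dynamic smile characteristics referred to above can be illustrated with a short feature-extraction sketch that tracks mouth width over frames and summarizes onset and offset speed. The 68-point landmark convention and the specific summary features are assumptions for illustration rather than the thesis's actual pipeline.

```python
# Illustrative sketch: summarizing smile dynamics from per-frame facial landmarks.
import numpy as np

def lip_corner_distance(landmarks):
    """landmarks: (n_frames, n_points, 2) array of facial landmarks per frame."""
    left, right = landmarks[:, 48, :], landmarks[:, 54, :]   # 68-point convention (assumed)
    return np.linalg.norm(right - left, axis=1)

def smile_dynamics(distance, fps=30.0):
    velocity = np.gradient(distance) * fps                   # change in mouth width per second
    return {
        "max_width": float(distance.max()),
        "onset_speed": float(velocity.max()),
        "offset_speed": float(velocity.min()),
        "duration_s": len(distance) / fps,
    }

frames = np.random.rand(90, 68, 2) * 100                     # placeholder landmark track
print(smile_dynamics(lip_corner_distance(frames)))
```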
