About

The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
161

Efeitos do escitalopram sobre a identificação de expressões faciais / Effects of escitalopram on the processing of emotional faces.

Wolme Cardoso Alves Neto 16 May 2008 (has links)
ALVES NETO, W.C. Effects of escitalopram on the processing of emotional faces. Ribeirão Preto, SP: Faculty of Medicine of Ribeirão Preto, University of São Paulo; 2008. Selective serotonin reuptake inhibitors (SSRIs) have been used successfully to treat a variety of psychiatric disorders. Their clinical efficacy is attributed to an enhancement of serotonergic neurotransmission, but little is known about the neuropsychological mechanisms underlying this process. Several lines of evidence suggest that serotonin is involved, among other functions, in the regulation of social behavior, in learning and memory, and in emotional processing. The recognition of basic emotions in facial expressions is a valuable paradigm for studying emotional processing, since faces are condensed, uniform stimuli of great relevance to social functioning. The aim of this study was to assess the effects of an acute oral dose of escitalopram, an SSRI, on the recognition of facial expressions of basic emotions. Twelve healthy male volunteers each completed two experimental sessions in a randomized, balanced-order, double-blind, placebo-controlled crossover design. An oral dose of 10 mg of escitalopram was administered 3 hours before the participants performed an emotion recognition task comprising six basic emotions (anger, fear, sadness, disgust, happiness, and surprise) plus a neutral expression. The faces were digitally morphed to create an intensity gradient from 10% to 100% of each emotion in 10% steps. Subjective mood and anxiety states were recorded throughout the task, and performance was measured as accuracy (the number of correct answers divided by the total number of stimuli presented). Overall, escitalopram interfered with the recognition of every expression except fear. Specifically, it facilitated the identification of sad faces and impaired the recognition of happy faces. When the gender of the faces was considered, this effect held for male faces, whereas for female faces escitalopram did not affect the recognition of sadness and improved the recognition of happiness. In addition, it improved the recognition of angry and disgusted faces when administered in the second session, and it impaired the identification of surprised faces at intermediate intensity levels. It also had an overall positive effect on task performance when administered in the second session. The results suggest a serotonergic modulation of the recognition of emotional facial expressions and of the recall of previously learned material.
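The accuracy measure described in the abstract above (correct responses over total stimuli, broken down by emotion and morph intensity) can be sketched as follows. The trial record format is a hypothetical illustration, not taken from the thesis:

```python
from collections import defaultdict

def accuracy_by_intensity(trials):
    """Accuracy (correct / total) grouped by (emotion, morph intensity).

    Each trial is a dict holding the emotion shown, the morph intensity
    (10..100 in steps of 10), and whether the response was correct.
    """
    counts = defaultdict(lambda: [0, 0])  # (emotion, intensity) -> [correct, total]
    for t in trials:
        key = (t["emotion"], t["intensity"])
        counts[key][0] += t["correct"]
        counts[key][1] += 1
    return {k: c / n for k, (c, n) in counts.items()}

# Made-up trials for illustration.
trials = [
    {"emotion": "sadness", "intensity": 50, "correct": 1},
    {"emotion": "sadness", "intensity": 50, "correct": 0},
    {"emotion": "happiness", "intensity": 100, "correct": 1},
]
acc = accuracy_by_intensity(trials)  # e.g. {("sadness", 50): 0.5, ...}
```

Aggregating per intensity step is what allows effects "at intermediate levels of intensity", as reported for surprised faces, to be isolated.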
162

An Investigation into Modern Facial Expressions Recognition by a Computer

January 2019 (has links)
abstract: Facial expression recognition using convolutional neural networks has been actively researched over the last decade because of its many applications in the human-computer interaction domain. Since convolutional neural networks have an exceptional ability to learn features, they outperform methods based on handcrafted features. Although state-of-the-art models achieve high accuracy on lab-controlled images, they still struggle with in-the-wild expressions. Wild expressions are captured in real-world settings and are natural rather than posed; wild databases present many challenges, such as occlusion and variations in lighting conditions and head pose. In this work, I address these challenges and propose a new model consisting of a hybrid convolutional neural network with a fusion layer. The fusion layer combines knowledge obtained from two different domains for enhanced feature extraction from in-the-wild images. I tested the network on two publicly available in-the-wild datasets, RAF-DB and AffectNet, and then tested the trained model on the CK+ dataset for a cross-database evaluation study. I show that the model achieves results comparable to state-of-the-art methods, and I argue that it performs well on such datasets because it learns features from two different domains rather than a single domain. Finally, I present a real-time facial expression recognition system in which images captured in real time by a laptop camera are passed to the model to obtain a facial expression label, demonstrating that the proposed model has low processing time and produces output almost instantly. / Dissertation/Thesis / Masters Thesis Computer Science 2019
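The abstract does not specify the fusion layer's internals; a common minimal form of cross-domain fusion is concatenating the feature vectors from two branch networks before a shared classification head. The sketch below illustrates only that idea, with random stand-in features in place of the two trained CNN branches:

```python
import numpy as np

rng = np.random.default_rng(0)

def fuse(features_a, features_b):
    """Fusion by concatenation of two domain-specific feature vectors."""
    return np.concatenate([features_a, features_b], axis=-1)

# Stand-ins for the two CNN branches (the real model would learn these).
feat_lab = rng.standard_normal(128)   # features from a lab-trained branch
feat_wild = rng.standard_normal(128)  # features from a wild-trained branch

fused = fuse(feat_lab, feat_wild)

# A linear classification head over the fused representation,
# with a softmax over 7 hypothetical expression classes.
W = rng.standard_normal((7, fused.size)) * 0.01
logits = W @ fused
probs = np.exp(logits - logits.max())
probs /= probs.sum()
label = int(np.argmax(probs))
```

The design intuition is that the classifier sees evidence from both domains at once, rather than features tuned to lab-controlled imagery alone.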
163

Le développement de la perception des expressions faciales / The development of facial expressions perception

Bayet, Laurie 26 November 2015 (has links)
This thesis examines the development of the perception of emotional facial expressions, framing it within the theoretical framework of face perception: the separation of variant (expression, gaze) and invariant (gender, race) dimensions, the role of experience, and social attention. More specifically, we investigated how, in infants and children, the perception of angry, smiling, or fearful facial expressions interacts with gender perception (Studies 1-2), gaze perception (Study 3), and face detection (Study 4). In a first study, we found that adults and 5-12-year-old children tend to categorize angry faces as male (Study 1). Comparing human performance with that of several automatic classifiers suggested that this bias reflects a strategy of using specific features and second-order relationships in the face to categorize gender. The bias was constant across all ages studied and extended to other-race faces, further suggesting that it does not require extensive experience. A second set of studies examined whether, in infants, the perception of smiling depends on experience-sensitive, invariant dimensions of the face such as gender and race (Study 2). Infants are typically most familiar with own-race female faces. The visual preference of 3.5-month-old infants for open-mouth, own-race smiling (versus neutral) faces was restricted to female faces and reversed for male faces. The effect did not replicate with own- or other-race closed-mouth smiles. We then attempted to extend these results to an object-referencing task in 3.5-, 9-, and 12-month-olds (Study 3). Objects previously referenced by smiling faces attracted no more attention than objects previously referenced by neutral faces, regardless of age group and face gender, despite differences in gaze following. Finally, we used a univariate measure (visual preference for the face side) and a multivariate measure (decoding evidence distinguishing face from noise) of trial-level face detection, coupled with non-linear mixed-effects modeling of psychometric curves, to reveal a detection advantage for fearful faces (compared with smiling faces) embedded in phase-scrambled noise in 3.5-, 6-, and 12-month-old infants (Study 4). The advantage was at least as evident in the youngest group as in the two older age groups. Taken together, these results shed light on the early ontogeny and underlying mechanisms of gender-emotion relationships in face perception and of the sensitivity to fear.
164

An Intercultural Analysis of Differences in Appropriateness Ratings of Facial Expressions Between Japanese and American Subjects

Peschka-Daskalos, Patricia Jean 28 April 1993 (has links)
In 1971 Paul Ekman posited his neuro-cultural theory of emotion, which states that expressions of emotion are universal but controlled by cultural display rules. This thesis tests the neuro-cultural theory by having subjects from two cultures, Japan and the United States, judge the perceived appropriateness of facial expressions in social situations. Preliminary procedures produced a set of scenarios in which the socially appropriate responses were deemed to be either "happy", "angry", or "surprised". Data in the experimental phase of the study were collected with a questionnaire. Using a 5-point Likert scale, each subject rated the appropriateness of happy, angry, and surprised expressions in positive, negative, and ambiguous social situations. Additionally, the subjects were asked to label each expression in each situation. The responses were analyzed statistically using analysis of variance, and label percentages were calculated for the second task in the study. No support was found for two of the three research hypotheses, and only partial support was found for the third. These results are discussed in terms of the need for greater theoretical and methodological refinement.
165

Mixed reality interactive storytelling : acting with gestures and facial expressions

Martin, Olivier 04 May 2007 (has links)
This thesis aims to answer the following question: "How can gestures and facial expressions be used to control the behavior of an interactive entertainment application?". An answer to this question is presented and illustrated in the context of mixed reality interactive storytelling. The first part focuses on the description of the Artificial Intelligence (AI) mechanisms used to model and control the behavior of the application. We present an efficient real-time hierarchical planning engine and show how active modalities (such as intentional gestures) and passive modalities (such as facial expressions) can be integrated into the planning algorithm, so that the narrative (driven by the behavior of the virtual characters inside the virtual world) can evolve in accordance with user interactions. The second part is devoted to the automatic recognition of user interactions. After briefly describing the implementation of a simple but robust rule-based gesture recognition system, the emphasis is placed on facial expression recognition. A complete solution integrating state-of-the-art techniques along with original contributions is presented, including face detection, facial feature extraction, and analysis. The proposed approach combines statistical learning and probabilistic reasoning to deal with the uncertainty associated with modeling facial expressions.
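The abstract does not detail the planning engine; the core of any hierarchical planner, however, is the recursive decomposition of compound tasks into primitive actions. The toy sketch below illustrates only that decomposition step, with a hypothetical narrative domain invented for the example:

```python
def plan(task, methods, primitives):
    """Tiny hierarchical decomposition: expand tasks depth-first into a
    flat list of primitive actions.

    `methods` maps each compound task to an ordered list of subtasks;
    `primitives` is the set of directly executable actions.
    """
    if task in primitives:
        return [task]
    steps = []
    for subtask in methods[task]:
        steps.extend(plan(subtask, methods, primitives))
    return steps

# Hypothetical narrative domain: a virtual character reacts to the user.
methods = {
    "greet_user": ["turn_to_user", "express_emotion"],
    "express_emotion": ["smile", "wave"],
}
primitives = {"turn_to_user", "smile", "wave"}
actions = plan("greet_user", methods, primitives)
```

In the thesis's setting, user gestures and facial expressions would influence which decomposition method is chosen at each step, which is how interaction steers the narrative; the real engine is necessarily far richer than this sketch.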
166

Emotional Empathy, Facial Reactions, and Facial Feedback

Andréasson, Per January 2010 (has links)
The human face has a fascinating capability to express emotions. The facial feedback hypothesis suggests that the human face not only expresses emotions but is also able to send feedback to the brain and modulate the ongoing emotional experience. It has furthermore been suggested that this feedback from the facial muscles could be involved in empathic reactions. This thesis explores the concept of emotional empathy and relates it to two aspects concerning activity in the facial muscles. First, do people high versus low in emotional empathy differ in regard to in what degree they spontaneously mimic emotional facial expressions? Second, is there any difference between people with high as compared to low emotional empathy in respect to how sensitive they are to feedback from their own facial muscles? Regarding the first question, people with high emotional empathy were found to spontaneously mimic pictures of emotional facial expressions while people with low emotional empathy were lacking this mimicking reaction. The answer to the second question is a bit more complicated. People with low emotional empathy were found to rate humorous films as funnier in a manipulated sulky facial expression than in a manipulated happy facial expression, whereas people with high emotional empathy did not react significantly. On the other hand, when the facial manipulations were a smile and a frown, people with low as well as high emotional empathy reacted in line with the facial feedback hypothesis. In conclusion, the experiments in the present thesis indicate that mimicking and feedback from the facial muscles may be involved in emotional contagion and thereby influence emotional empathic reactions. Thus, differences in emotional empathy may in part be accounted for by different degree of mimicking reactions and different emotional effects of feedback from the facial muscles.
167

Emotion Recognition from Eye Region Signals using Local Binary Patterns

Jain, Gaurav 08 December 2011 (has links)
Automated facial expression analysis for emotion recognition (ER) is an active research area aimed at creating socially intelligent systems. The eye region, often considered integral to ER by psychologists and neuroscientists, has received very little attention in engineering and computer science. Using the eye region as an input signal offers several benefits for low-cost, non-intrusive ER applications. This work proposes two frameworks for ER from eye region images. The first framework uses Local Binary Patterns (LBP) as the feature extractor on grayscale eye region images. The results validate the eye region as a significant contributor to communicating emotion in the face, achieving high person-dependent accuracy; the system also generalizes well across different environmental conditions. In the second framework, a color-based approach to ER from the eye region is explored using Local Color Vector Binary Patterns (LCVBP). LCVBP extend traditional LBP by incorporating color information, extracting a rich and highly discriminative feature set and thereby providing promising results.
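The basic LBP operator used in the first framework compares each pixel with its eight immediate neighbours and encodes the comparisons as an 8-bit code; a normalized histogram of those codes is the usual texture descriptor. A minimal sketch of that standard operator (not the thesis's exact implementation):

```python
import numpy as np

def lbp_8_1(img):
    """Basic 8-neighbour, radius-1 Local Binary Patterns on a grayscale image.

    Each interior pixel is replaced by an 8-bit code: one bit per
    neighbour, set when that neighbour is >= the centre pixel.
    """
    img = np.asarray(img, dtype=np.int32)
    h, w = img.shape
    codes = np.zeros((h - 2, w - 2), dtype=np.uint8)
    # Clockwise neighbour offsets starting at the top-left pixel.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    center = img[1:-1, 1:-1]
    for bit, (dy, dx) in enumerate(offsets):
        neighbour = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        codes |= (neighbour >= center).astype(np.uint8) << bit
    return codes

def lbp_histogram(img, bins=256):
    """Normalized histogram of LBP codes: the texture feature vector."""
    codes = lbp_8_1(img)
    hist, _ = np.histogram(codes, bins=bins, range=(0, bins))
    return hist / hist.sum()
```

A classifier for ER would then be trained on these histogram vectors computed from cropped eye region images.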
169

Expressive Control and Emotion Perception: the Impact of Expressive Suppression and Mimicry on Sensitivity to Facial Expressions of Emotion

Schneider, Kristin Grace 28 May 2008 (has links)
Recent studies have linked expressive suppression to impairments in interpersonal functioning, but the mechanism underlying this relationship has not been well articulated. One possibility is that individuals who engage in expressive suppression are impaired in perceiving the emotions of others, a critical ability for successful interpersonal functioning. In the current study, participants were presented with a series of photographs of facial expressions manipulated so that they appeared to "morph" from neutral into full emotion expressions. As they viewed these images, participants were instructed to identify each expression as quickly as possible by selecting one of six emotion labels (happiness, sadness, fear, anger, surprise, and disgust) on the screen. Prior to this task, participants were randomized to one of three groups: instructed to mimic the expressions on the screen, instructed to suppress all emotion expressions, or given no specific instructions on expression control (the control group). The speed with which participants accurately identified emotional expressions (emotion sensitivity) was the primary variable of interest. Overall, participants in the suppression condition were slower to accurately identify emotions, while no statistically significant differences were found between the mimicry and no-instructions conditions. The decreased emotion sensitivity in the suppression group could not be accounted for by impulsive responding, decreased sensitivity at full expression, or perceived difficulty of the task. / Dissertation
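The emotion sensitivity measure above rewards accurate identification early in the morph sequence. A minimal sketch of scoring a single trial, using a hypothetical response-log format invented for the example:

```python
def emotion_sensitivity(responses, true_emotion):
    """Return the time of the first accurate identification in a trial.

    `responses` is a hypothetical list of (time_ms, label) pairs recorded
    while a face morphs from neutral toward the full expression; an
    earlier accurate answer indicates higher emotion sensitivity.
    Returns None if the expression is never correctly identified.
    """
    for time_ms, label in responses:
        if label == true_emotion:
            return time_ms
    return None

# One made-up trial: a wrong guess, then the correct label.
trial = [(600, "surprise"), (1450, "fear")]
rt = emotion_sensitivity(trial, "fear")
```

Group means of such per-trial times are what the suppression, mimicry, and control conditions would then be compared on.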
170

Recognition Of Facial Expressions In Alcohol Dependent Inpatients

Dursun, Pinar 01 June 2007 (has links) (PDF)
ABSTRACT: Recognition of Emotional Facial Expressions in Alcohol-Dependent Inpatients. Dursun, Pinar, M.S., Department of Psychology. Supervisor: Assoc. Prof. Faruk Gençöz. June 2007, 130 pages. The ability to recognize emotional facial expressions (EFE) is critical for social interaction and daily functioning. Recent studies have shown that alcohol-dependent individuals have deficits in the recognition of these expressions. The objective of this study was therefore to explore the presence of impairment in the decoding of universally recognized facial expressions (happiness, sadness, anger, disgust, fear, surprise, and neutral expressions) and to measure manual reaction times (RT) toward these expressions in alcohol-dependent inpatients. A demographic information form, the CAGE alcoholism inventory, the State-Trait Anxiety Inventory (STAI), the Beck Depression Inventory (BDI), the Symptom Checklist (SCL), and a purpose-built computer program (the Emotion Recognition Test) were administered to 50 detoxified alcohol-dependent inpatients and 50 matched control participants. It was hypothesized that alcohol-dependent participants would show greater deficits in the accuracy of reading EFE and would react more rapidly to negative EFE (fear, anger, disgust, sadness) than the control group. A series of ANOVA, ANCOVA, MANOVA, and MANCOVA analyses revealed that alcohol-dependent individuals were more likely to have depression and anxiety disorders than non-dependents. They recognized disgusted expressions less accurately, but responded to them faster, than non-dependent individuals; the two groups did not differ significantly in total accuracy. In addition, levels of depression and anxiety did not affect recognition accuracy or reaction times. Stepwise multiple regression analysis indicated that the obsessive-compulsive subscale of the SCL, the BDI, the STAI-S form, and the recognition of fearful as well as disgusted expressions were associated with alcoholism.
Results are discussed in relation to previous findings in the literature. The inaccurate identification of disgusted faces may reflect organic deficits resulting from alcohol consumption, or cultural factors that play an important role in the display of expressions.
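The thesis relies on ANOVA-family analyses for its group comparisons. As a minimal illustration of the kind of comparison involved (for example, faster responses to disgusted faces in the dependent group), the sketch below computes a Welch t statistic on made-up reaction times; none of the numbers come from the study:

```python
import numpy as np

def welch_t(a, b):
    """Welch's t statistic for two independent samples (unequal variances)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    se = np.sqrt(a.var(ddof=1) / a.size + b.var(ddof=1) / b.size)
    return (a.mean() - b.mean()) / se

# Hypothetical reaction times (ms) toward disgusted faces.
rt_dependent = [640, 610, 655, 630, 645]
rt_control = [700, 690, 720, 705, 715]
t = welch_t(rt_dependent, rt_control)  # negative: dependents respond faster
```

The study's actual MANCOVA analyses additionally control for covariates such as depression and anxiety scores, which a two-sample statistic like this does not.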
