1

The Social World Through Infants’ Eyes: How Infants Look at Different Social Figures

Schmitow, Clara A. January 2012 (has links)
This thesis examines how infants actively look at different social figures: parents and strangers. To record looking behavior in “live” situations, new methods were tested. Study 1 developed a head-mounted camera method for recording looking behavior in “live” situations. The camera was calibrated across a range of angles and then used to measure how infants look at faces and objects in two “live” situations: a conversation and a joint action. The head-mounted camera showed high reliability for horizontal positions and proved usable in a range of “live” situations with infants from 6 to 14 months of age. In Study 2, the head-mounted camera and a static camera were used in a “live” ambiguous situation to study whom infants prefer to refer to, the parent or a stranger, and whose information they use. Experiment 1 of Study 2 showed that when no information is provided in an ambiguous laboratory situation, 10-month-old infants look more at the experimenter than at the parent. Experiment 2 of Study 2 showed that infants also made more use of the emotional information provided by the experimenter than of that provided by the parent to regulate their behavior. In Study 3, looking behavior was analyzed in detail while infants viewed pictures of their parents’ and strangers’ emotional facial expressions, recorded with corneal eye tracking. The influence of identity, gender, emotional expression, and parental leave on looking behavior was analyzed. The results indicated that identity and experience of looking at others influence how infants discriminate emotions in pictures of facial expressions. Fourteen-month-old infants whose parents had both taken parental leave discriminated more emotional expressions in strangers than infants with only one parent on leave. They also reacted with larger pupil dilation to the parent currently on parental leave than to the parent not on leave. Finally, fearful facial expressions were scanned more broadly than neutral or happy ones. Together, these studies indicate that infants discriminate between mothers’, fathers’, and strangers’ emotional facial expressions and use other people’s expressions to regulate their behavior. In addition, a new method, the head-mounted camera, was shown to capture infants’ looking behavior in “live” situations.
2

An ERP Study of Responses to Emotional Facial Expressions: Morphing Effects on Early-Latency Valence Processing

Ravich, Zoe 01 April 2012 (has links)
Early-latency theories of emotional processing state that at least coarse monitoring of the emotional valence (a pleasure-displeasure continuum) of facial expressions should be both rapid and highly automated (LeDoux, 1995; Russell, 1980). Research has largely substantiated early-latency differential processing of emotional versus non-emotional facial expressions; however, the effect of valence on early-latency processing of emotional facial expressions remains unclear. In an effort to delineate the effects of valence on early-latency emotional facial expression processing, the current investigation compared ERP responses to positive (happy and surprise), neutral, and negative (afraid and sad) basic facial expression photographs, as well as to positive (happy-surprise), neutral (afraid-surprise, happy-afraid, happy-sad, sad-surprise), and negative (sad-afraid) morphed facial expression photographs, during a valence-rating task. Morphing manipulations have been shown to decrease the familiarity of facial patterns and thus to preclude overlearned responses to specific facial codes. Accordingly, it was proposed that morph stimuli would disrupt more detailed emotional identification and reveal a valence response independent of a specific identifiable emotion (Balconi & Lucchiari, 2005; Schweinberger, Burton & Kelly, 1999). ERP results revealed early-latency differentiation between positive, neutral, and negative morph facial expressions approximately 108 milliseconds post-stimulus (P1) within the right electrode cluster; negative morph facial expressions continued to elicit significantly smaller ERP amplitudes than the other valence categories approximately 164 milliseconds post-stimulus (N170). Consistent with previous imaging research on emotional facial expression processing, source localization revealed substantial dipole activation within regions of the mesolimbic dopamine system. These findings confirm rapid valence processing of facial expressions and suggest that negative valence processing may continue to modulate subsequent structural facial processing.
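The component measures this abstract reports (P1 around 108 ms, N170 around 164 ms, a right electrode cluster) are typically quantified as mean amplitudes within a latency window over a channel cluster. The following is a minimal illustrative sketch of that computation, assuming epoched data in a NumPy array; the sampling rate, window bounds, channel indices, and data are placeholders, not the study's actual pipeline.

```python
import numpy as np

# Hypothetical epoched EEG: (n_trials, n_channels, n_samples), 500 Hz,
# with t = 0 at stimulus onset. Random placeholder data for illustration.
sfreq = 500.0
epochs = np.random.randn(120, 32, 400)
times = np.arange(epochs.shape[-1]) / sfreq  # seconds post-stimulus

def mean_window_amplitude(epochs, times, t_min, t_max, channels):
    """Mean amplitude over a latency window and channel cluster,
    averaged across trials -- a common way to quantify an ERP component."""
    mask = (times >= t_min) & (times <= t_max)
    return epochs[:, channels, :][:, :, mask].mean()

right_cluster = [24, 25, 26]  # assumed indices of a right posterior cluster
p1 = mean_window_amplitude(epochs, times, 0.088, 0.128, right_cluster)   # P1 ~108 ms
n170 = mean_window_amplitude(epochs, times, 0.144, 0.184, right_cluster) # N170 ~164 ms
print(f"P1: {p1:.2f} µV, N170: {n170:.2f} µV")
```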
3

Children's Self-reported Emotions and Emotional Facial Expressions Following Moral Transgressions

Dys, Sebastian P. 22 November 2013 (has links)
This study examined self-reported emotions and emotional facial expressions following moral transgressions using an ethnically diverse sample of 242 4-, 8-, and 12-year-old children. Self-reported emotions were examined in response to three transgression contexts: an intentional harm, an instance of social exclusion, and an omission of a prosocial duty. Children’s emotional expressions of sadness, happiness, anger, fear and disgust were analyzed immediately after being asked how they would feel if they had committed one of the described transgressions. Emotional expressions were scored using automated emotion recognition software. Four-year-olds reported significantly more happiness as compared to 8- and 12-year-olds. In addition, self-reports of sadness decreased between 8- and 12-year-olds, while self-reported guilt increased between these age groups. Furthermore, 4- and 8-year-olds demonstrated higher levels of facially expressed happiness than 12-year-olds. These findings highlight the role of automatic affective and controlled cognitive processes in the development of children’s emotions following moral transgressions.
4

Effets de la rumination induite sur l’inhibition des interférences émotionnelles

Ferron, Jean-Philippe 08 1900 (has links)
Rumination is a persistent, repetitive, and negative style of thinking, characterized by passivity and a feeling of helplessness, centered on the emotions experienced in response to a past negative event. Studies suggest that it is associated with impairments of cognitive functioning, but some of its cognitive mechanisms remain poorly understood. In particular, there is evidence of an association between rumination and an impaired ability to inhibit interference from distracting negative stimuli (IIS), but studies on the topic are contradictory. The goal of this thesis was to clarify the nature of this relation. In two experiments, performance on IIS tasks was compared between an experimental group of participants in whom rumination was induced and a control group in whom it was not. In the first experiment, the task consisted of identifying the pointing direction of a target arrow while ignoring distracting arrows flanking the target. The results showed no performance difference attributable to rumination. In the second, instead of arrows, participants had to identify whether the emotional facial expression of a target face was neutral, positive, or negative. Participants in the experimental group were more easily distracted when identifying a positive target and less easily distracted when the target was negative. The persistence of rumination on negative information may be explained, in part, by a strengthening of the ability to inhibit interference specifically for negative thoughts and a weakening of this ability for positive thoughts.
5

Alexithymia Is Associated With Deficits in Visual Search for Emotional Faces in Clinical Depression

Suslow, Thomas, Günther, Vivien, Hensch, Tilman, Kersting, Anette, Bodenschatz, Charlott Maria 31 March 2023 (has links)
Background: The concept of alexithymia is characterized by difficulties identifying and describing one’s emotions. Alexithymic individuals are impaired in recognizing others’ emotional facial expressions. Alexithymia is quite common in patients suffering from major depressive disorder. The face-in-the-crowd task is a visual search paradigm that assesses the processing of multiple facial emotions. In the present eye-tracking study, the relationship between alexithymia and visual processing of facial emotions was examined in clinical depression. Materials and Methods: Gaze behavior and manual response times of 20 alexithymic and 19 non-alexithymic depressed patients were compared in a face-in-the-crowd task. Alexithymia was measured with the 20-item Toronto Alexithymia Scale. Angry, happy, and neutral facial expressions of different individuals were shown as target and distractor stimuli. Our analyses of gaze behavior focused on latency to the target face, the number of distractor faces fixated before fixating the target, the number of target fixations, and the number of distractor faces fixated after fixating the target. Results: Alexithymic patients exhibited generally slower decision latencies than non-alexithymic patients in the face-in-the-crowd task. The patient groups did not differ in latency to target, number of target fixations, or number of distractors fixated prior to target fixation. However, after having looked at the target, alexithymic patients fixated more distractors than non-alexithymic patients, regardless of expression condition. Discussion: According to our results, alexithymia is accompanied by impairments in the visual processing of multiple facial emotions in clinical depression. Alexithymia appears to be associated with delayed manual reaction times and prolonged scanning after the first target fixation in depression, but it might have no impact on the early search phase. The observed deficits could indicate difficulties in target identification and/or decision-making when processing multiple emotional facial expressions. The impairments of alexithymic depressed patients in processing emotions in crowds of faces do not seem limited to a specific affective valence. In group situations, alexithymic depressed patients might be slower than non-alexithymic depressed patients at processing interindividual differences in emotional expressions, which could represent a disadvantage in understanding non-verbal communication in groups.
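The four gaze measures this abstract names can all be derived from an ordered sequence of fixations labeled by the face they landed on. The sketch below illustrates that computation under an assumed, simplified data format (a label plus an onset time per fixation); it is not the authors' analysis code.

```python
# Minimal sketch of the four face-in-the-crowd gaze measures described above,
# computed from a per-trial fixation sequence. The data format is hypothetical.

def gaze_measures(fixations):
    """fixations: list of (label, onset_seconds) tuples in temporal order,
    where label is 'target' or 'distractor'."""
    target_idxs = [i for i, (label, _) in enumerate(fixations) if label == "target"]
    if not target_idxs:
        return None  # target never fixated on this trial
    first = target_idxs[0]
    return {
        "latency_to_target": fixations[first][1],
        "distractors_before_target": sum(
            1 for label, _ in fixations[:first] if label == "distractor"),
        "target_fixations": len(target_idxs),
        "distractors_after_target": sum(
            1 for label, _ in fixations[first + 1:] if label == "distractor"),
    }

trial = [("distractor", 0.21), ("distractor", 0.44), ("target", 0.69),
         ("distractor", 0.95), ("target", 1.22)]
print(gaze_measures(trial))
# {'latency_to_target': 0.69, 'distractors_before_target': 2,
#  'target_fixations': 2, 'distractors_after_target': 1}
```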
6

ERP Analyses of Perceiving Emotions and Eye Gaze in Faces: Differential Effects of Motherhood and High Autism Trait

Bagherzadeh-Azbari, Shadi 08 May 2023 (has links)
Eye gaze and its direction are important non-verbal cues for establishing social interactions and for perceiving others’ emotional facial expressions. Gaze direction itself, whether the eyes look straight at the viewer (direct gaze) or away (averted gaze), affects our social attention and emotional response. This implies that both emotion and gaze carry informational value, and that the two might interact at early or later stages of neurocognitive processing. Despite a proposed theoretical basis for this interaction, the shared signal hypothesis (Adams & Kleck, 2003), there is a lack of structured electrophysiological investigation of the interactions between emotion and gaze, their neural correlates, and how they vary across populations. Addressing this need, the present doctoral dissertation used event-related brain potentials (ERPs) to study responses to emotional expressions and gaze direction in a novel paradigm combining static and dynamic gaze with facial expressions. The N170 and EPN were selected as ERP components believed to reflect gaze perception and reflexive attention, respectively. Three different populations were investigated. Study 1, in a normal sample, examined the amplitudes of the ERP components elicited by the initial presentation of faces and by subsequent changes of gaze direction in half of the trials. In Study 2, motivated by the atypical face processing and diminished responses to eye gaze in autism, ERPs and eye movements were examined in two samples of children varying in the severity of their autism traits. In Study 3, in a large sample, I addressed the putatively increased sensitivity to emotion processing and eye gaze in mothers during the postpartum period, with a particular focus on infants’ faces. Taken together, the results of the three studies demonstrate that in social interactions the emotional effects of faces are modulated by dynamic gaze direction.
7

Le décodage des expressions faciales émotionnelles à travers différentes bandes de fréquences spatiales et ses interactions avec l’anxiété

Harel, Yann 08 1900 (has links)
The decoding of emotional facial expressions (EFE) is a key function of the human visual system, since it underlies the non-verbal communication on which social interactions rest. Numerous studies suggest that the diagnostic features of faces may be processed differently in low and high spatial frequencies (SF), subtended by the magno- and parvocellular pathways, respectively. Moreover, conditions such as social anxiety may affect this processing and modulate the associated event-related potentials (ERPs). This study explores the feasibility of predicting individuals’ social anxiety levels from the electrophysiological correlates of EFE decoding across different SF bands. To this end, ERPs from 26 participants (mean age = 23.7 ± 4.7 years) were recorded during the visual presentation of neutral, happy, and angry facial expressions filtered to retain only low, medium, or high SF. Social anxiety was assessed beforehand with the LSAS questionnaire. Peak latencies and amplitudes of the P100, the N170, the N2b/P3a complex, and the P3b were analyzed statistically and used to train different classification algorithms. P100 amplitude was related to SF content. The N170 showed an effect of EFE. The N2b/P3a complex was larger for EFE and earlier for high SF. The P3b was smaller for neutral faces, which were also more often missed. Linear discriminant analysis yielded a mean decoding accuracy of 56.11% across the significant features. The nature of these features and their sensitivity to social anxiety are discussed.
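For readers curious how the decoding step reported above typically looks in practice, here is a minimal sketch of linear discriminant analysis over ERP peak features with leave-one-out cross-validation. The feature layout, group labeling, and cross-validation scheme are assumptions for illustration, not the thesis's actual pipeline.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# 26 participants x significant ERP features (e.g., P100/N170/N2b-P3a/P3b
# peak amplitudes and latencies); random placeholder data here.
X = rng.standard_normal((26, 8))
y = rng.integers(0, 2, 26)  # hypothetical 0 = low, 1 = high social anxiety

# Standardize features, then classify; score with leave-one-out, which is a
# common choice for small samples like 26 participants.
clf = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
scores = cross_val_score(clf, X, y, cv=LeaveOneOut())
print(f"mean decoding accuracy: {scores.mean():.2%}")
```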
