1

Differential Perception of Auditory and Visual Aspects of Emotion in 7- to 15-Month-Old Infants

Kim, Lawrence N. January 2018 (has links)
Infant-directed registers are a form of emotion communication, conveying feelings and intentions to infants and toddlers that may facilitate and modulate attention and language learning. As infants are attracted to emotion, it is essential to understand how infants process emotional information. This study used an infant-controlled habituation paradigm to examine how 7- to 15-month-old infants discriminate changes in visual emotion, auditory emotion, or visual+auditory emotion after being habituated to a bimodal emotion display. The purpose of this study was to examine which modality (facial emotion; vocal emotion) was more salient for infants discriminating emotions in the context of bimodal stimulation. Infants were habituated to happy audiovisual displays and then received four test trials, during which neither source of emotion information was changed (control), only the auditory emotion was changed, only the visual emotion was changed, or both sources of emotion information were changed. It was predicted that infants would show greater recovery of attention to a change in visual emotion than to a change in auditory emotion, but less than when both auditory and visual information were changed. However, the results showed that infants were equally sensitive to all three types of emotion change. These results are discussed in terms of current conceptualizations of how emotion processing is related to the negativity bias and to infants' experience with the two emotions. / Master of Science / When we interact with infants, we convey feelings and intentions that may facilitate and modulate attention and language learning. As infants are attracted to emotions, it is essential that we understand how infants process emotional information.
While previous studies have shown that infants are capable of discriminating different kinds of emotions, no known study has examined whether infants are more sensitive to a change in facial expression or in vocal expression when they experience both facial and vocal expressions together. To examine this, infants were habituated to happy audiovisual displays. Infants then watched four audiovisual displays: 1) the same happy audiovisual display, 2) a display with a happy face and fearful voice, 3) a display with a fearful face and happy voice, and 4) a display with a fearful face and fearful voice. It was expected that infants would look longer when the facial expression was changed than when the vocal expression was changed, but less long than when both facial and vocal expressions were changed. However, the results showed that infants were equally sensitive to a change in facial expression, in vocal expression, and in both facial and vocal expressions.
2

Multisensory integration of social information in adult aging

Hunter, Edyta Monika January 2011 (has links)
Efficient navigation of our social world depends on the generation, interpretation and combination of social signals within different sensory systems. However, the influence of adult aging on cross-modal integration of emotional stimuli remains poorly understood. Therefore, the aim of this PhD thesis is to understand the integration of visual and auditory cues in social situations and how this is associated with other factors important for successful social interaction, such as recognising emotions or understanding the mental states of others. A series of eight experiments was designed to compare the performance of younger and older adults on tasks related to multisensory integration and social cognition. Results suggest that older adults are significantly less accurate at correctly identifying emotions from one modality (faces or voices alone) but perform as well as younger adults on tasks where congruent auditory and visual emotional information are presented concurrently. Therefore, older adults appear to benefit from congruent multisensory information. In contrast, older adults are poorer than younger adults at detecting incongruency between sensory modalities when decoding cues to deception, sarcasm or masking of emotions. It was also found that age differences in the processing of relevant and irrelevant visual and auditory social information might be related to changes in gaze behaviour. A further study demonstrated that the changes in behaviour and social interaction often reported in patients post-stroke might relate to problems in integrating cross-modal social information. The pattern of findings is discussed in relation to social, emotional, neuropsychological and cognitive theories.
3

The role of temporal lobe structures in the attribution of affect and social cognition

Houghton, Judith Mary January 2000 (has links)
No description available.
4

The Influence of Victim Gender and Emotional Expression in Victim Impact Statements on Legal Judgments and Punishment Decisions

Chimowitz, Hannah 01 July 2021 (has links)
Victim impact statements (VISs) are written or oral statements detailing the effects a crime has had on a victim. While the practice of having victims present VISs at sentencing hearings has generated much debate for over 25 years, the effects of this practice on victims, defendants, and legal decision-makers remain poorly understood. Prior research suggests that a victim's emotional expression can affect how victims are perceived and the legal judgments made in response to their statements. The current research considers how the effects of victims' emotional displays on sentencing decisions might be conditioned by victim gender. Using audio-recorded VIS stimuli, the present research investigated the influence of victim gender (male vs. female) and emotional expression (Study 1: anger vs. sadness; Study 2: anger vs. sadness vs. flat affect) on legal judgments and punishment decisions. The results across Study 1 and Study 2 are inconsistent, though findings from Study 2, which had a substantially larger sample, suggest that individuals make legal judgments that are more favorable toward female victims, regardless of the victim's emotional expression in a VIS. However, hostile sexism and gender-emotion stereotype endorsement moderated the effects of victim emotional expression and gender on sentence severity and empathy for a defendant.
5

Understanding musical emotion: Exploring the interaction between cues, training, and interpretation

Battcock, Aimee January 2019 (has links)
Previous work on conveyed musical emotion has often focused on experimentally composed and manipulated music, or on multi-lined music selected to express overt emotions. This highly controlled approach may overlook some aspects of the complex relationship between composers, performers, and listeners in transmitting emotional messages. My PhD research focuses on how listeners perceive emotion in music, specifically how listeners interpret musical features such as timing, mode, and pitch in complex musical stimuli. I also demonstrate how listeners with musical expertise use cues differently to perceive emotion, and the effect of performer interpretation on this communication process. Throughout this dissertation I use a dimensional approach, capturing perceived valence and arousal, to assess complex musical stimuli. I adapted to music a technique used in other domains, affording an opportunity to explore nuanced relationships between cues and listener ratings of emotion. In Chapter 1 I show that musically untrained adults mainly use cues of timing and mode when rating emotional valence, mirroring previously reported findings. Additionally, I show that although pitch information emerges as a significant predictor of listeners' valence ratings, listeners use it less than cues such as timing and mode. Further, I demonstrate that neither mode nor pitch information helps listeners rate perceived arousal. In Chapter 3, I demonstrate that listeners with musical training use cues differently than untrained listeners, with more reliance on information communicated through mode when making judgements of emotional valence. Finally, in Chapter 4, I show that differences in performer interpretation mediate the strength of individual cues, as well as the distribution of emotional ratings across each album.
Altogether these findings corroborate previous evidence suggesting that timing and mode cues are of the greatest importance in conveying and perceiving emotion, and that this process is further mediated by individual differences in both pianist (interpretation) and listener (musical training), underscoring the complex relationship between composer, performer, and audience. / Dissertation / Doctor of Philosophy (PhD) / Musical performers and composers express emotions through the selection and use of various musical features, or cues. Studies exploring how listeners perceive emotion in music have identified several cues important to this process, often using tightly controlled (and constrained) tone sequences crafted for experimental purposes. More work is needed to examine how listeners decode communicated emotion in unaltered passages created by renowned composers: the kind of music routinely performed and enjoyed by audiences for generations. Here, in three sets of experiments, I apply a novel stimulus set and analysis to determine the relative importance of three musical features. Additionally, I explore the role of the listener's level of expertise as well as the importance of performers' interpretative decisions. My work offers a new way to understand the relationship between musical features and emotional messages, helping to clarify one of music's most mysterious and powerful capabilities.
6

Effects of Motion on Infants' Negativity Bias in Emotion Perception

Heck, Alison Rae 24 January 2013 (has links)
The negativity bias is a phenomenon in which infants are more influenced by, attend more to, and respond more to negative emotion information from the environment than to positive emotion information. This study used a Tobii T60 eye-tracking system to examine differences in 8- to 12-month-old infants' latencies to disengage from a centrally-presented face for three different emotion conditions: happy, sad, and fear. The events also varied by motion type (static versus dynamic). Additionally, infants' locomotor experience and parental affect served as two additional measures of experience and were assessed for their contributions to the infants' negativity bias. It was expected that infants would show longer latencies to disengage from the negative emotion events (fear or sad) compared to the positive emotion event (happy), but also that the latencies would be augmented by event type (dynamic > static), locomotion experience (high > low), and parental affect (higher negativity > lower negativity). Although infants showed more attention to dynamic than static emotion displays (especially to the speaker's mouth), and more attention to happy and sad than to fear displays, no consistent effect of emotion type was found on infants' attention disengagement. Thus, no evidence for a negativity bias was seen. The results are interpreted with respect to possible contributions of the bimodal nature of emotion expression in the current study, as well as age-related attentional differences in responding to a wide range of emotion cues. / Master of Science
7

Early Predictors of Variations in Children's Emotion Understanding: Relations With Children's Disruptive Behaviors

January 2011 (has links)
The purpose of this study was to examine the longitudinal relations of maternal behaviors, children's temperamental negative emotionality, and children's emotion perception processes, including emotion perception accuracy (EPA) and emotion perception bias (EPB), to children's conduct disorder symptoms in a normative sample. Separate structural equation models were conducted to assess whether parenting or children's proneness to negative emotions at 24-30 (T2), 36-42 (T3) and 48-54 (T4) months predicted children's EPA and EPB over time, and whether T3 and T4 children's emotion perception processes were predictive of children's conduct disorder at 72 months of age (T5). None of the hypothesized longitudinal relations was supported; however, other noteworthy results were observed. T3 children's proneness to negative emotions was positively related to children's concurrent bias toward anger. The latent constructs of negative parenting, children's proneness to negative emotions, and the observed measure of children's emotion perception accuracy showed stability over time, whereas the observed measures of children's bias toward understanding distinct negative emotions were unrelated across time. In addition, children's expressive language was predicted by children's earlier emotion perception accuracy, which emphasized the importance of improving children's emotion understanding skills during early years. Furthermore, the previously established negative relation between EPA and EPB variables was only partially supported. Findings regarding the relations between parenting, children's negative emotionality and emotion perception processes are discussed from a developmental perspective. / Dissertation/Thesis / M.S. Family and Human Development 2011
8

Neurocognition, Emotion Perception and Quality of Life in Schizophrenia

Aldebot, Stephanie 01 January 2009 (has links)
Patients with schizophrenia have extremely high levels of depression and suicide (Carlborg et al., 2008), thus, a better understanding of factors associated with poor quality of life (QoL) for this population is sorely needed. A growing body of research suggests that cognitive functioning in schizophrenia may be a strong predictor of overall QoL (Green et al., 2000), but individual domains of QoL have not been examined. Indirect evidence also suggests that emotion perception may underlie the relationship between neurocognition and QoL, but this hypothesis has also yet to be tested. Using a sample of 92 clinically stable schizophrenia patients, the current study explores the relationship between neurocognition, namely attention and working memory, and the following sub domains of QoL: social, vocational, intrapsychic foundations and environmental engagement. The current study also examines whether emotion perception mediates this relationship. In partial support of hypotheses, patients with more deficits in working memory reported decreased Occupational QoL and, although only marginally significant, decreased Total QoL. There was also a trend for poorer working memory to be associated with poorer Intrapsychic Foundations QoL. Contrary to hypotheses, emotion perception was not found to mediate the relationship between working memory and QoL. Current findings suggest that interventions that specifically target working memory may also improve many other aspects of schizophrenia patients' QoL.
9

Gaze Fixation during the Perception of Visual and Auditory Affective Cues

McManus, Susan M. 15 October 2009 (has links)
The accurate integration of audio-visual emotion cues is critical for social interactions and requires efficient processing of facial cues. Gaze behavior of typically developing young adults was measured via eye-tracking during the perception of dynamic audio-visual emotion (DAVE) stimuli. Participants were able to identify basic emotions (angry, fearful, happy, neutral) and determine the congruence of facial expression and prosody. Perception of incongruent videos resulted in increased reaction times and emotion identification consistent with the facial expression. Participants consistently demonstrated a featural processing approach across all tasks, with a significant preference for the eyes. Evidence of hemispheric lateralization was indicated by preferential fixation to the left (happy, angry) or right eye (fearful). Fixation patterns differed according to the facially expressed emotion, with the pattern that emerged during fearful movies supporting the significance of automatic threat processing. Finally, fixation pattern during the perception of incongruent movies varied according to task instructions.
10

Perception of Emotion from Facial Expression and Affective Prosody

Santorelli, Noelle Turini 09 June 2006 (has links)
Real-world perception of emotion results from the integration of multiple cues, most notably facial expression and affective prosody. The use of incongruent emotional stimuli presents an opportunity to study the interaction between sensory modalities. Thirty-seven participants were exposed to audio-visual stimuli (Robins & Schultz, 2004) including angry, fearful, happy, and neutral presentations. Eighty stimuli contained matching emotions and 240 contained incongruent emotional cues. Matching emotions elicited a significant number of correct responses for all four emotions. Sign tests indicated that for most incongruent conditions, participants demonstrated a bias towards the visual modality. Despite these findings, specific incongruent conditions did show evidence of blending. Future research should explore an evolutionary model of facial expression as a means for behavioral adaptation and the possibility of an “emotional McGurk effect” in particular combinations of emotions.
