About
The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations. Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
1. Visual and Auditory Perception of Emotion in Adolescents with Bipolar Disorder

Foster, Mary Kristin 03 April 2007 (has links)
No description available.
2. Recognition of Facial Affect in individuals scoring high and low on Psychopathic Personality Characteristics

Ali, Afiya January 2007 (has links)
The accuracy of perceiving facial emotion expressions was studied in individuals with low and high psychopathic personality characteristics in a sample of 21 male and 39 female university students. Participants completed the Psychopathic Personality Inventory (PPI) and the Behavioural Inhibition Scale and Behavioural Activation Scale (BIS/BAS) as measures of psychopathy, together with a computerised emotion recognition task containing facial expressions of six emotions, each presented at five different intensities. The results showed that participants scoring low on the BIS and high on the BAS showed significant impairments in the recognition of both sad and fearful expressions. In contrast, the group scoring high on the PPI showed significant impairment in the recognition of angry, but not fearful or sad, expressions in the total sample. Males with high psychopathic personality characteristics showed significant impairments in the recognition of sad, fearful and angry expressions compared to males with low psychopathic personality characteristics, whereas females with high psychopathic personality characteristics showed significant impairment only in recognising expressions of disgust compared to females with low psychopathic personality characteristics. The PPI and the BIS/BAS scales showed reasonable alpha reliabilities, with the exception of one subscale in each measure, and correlations between the PPI and the BIS/BAS scales were weak to moderate. The current findings suggest that different dimensions of psychopathy may be associated with selective impairments in recognising unpleasant emotion expressions in others.
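To make the scoring of such a task concrete, here is a minimal sketch of computing per-emotion, per-intensity recognition accuracy for each group from trial-level data; the data layout and field names are assumptions for illustration, not the thesis's analysis code.

```python
from collections import defaultdict

def recognition_accuracy(trials, group):
    """Proportion correct per (emotion, intensity) cell for one group.
    Each trial is a dict with 'group', 'emotion', 'intensity', 'response';
    a trial is correct when the chosen label matches the displayed emotion."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for t in trials:
        if t["group"] != group:
            continue
        cell = (t["emotion"], t["intensity"])
        total[cell] += 1
        correct[cell] += int(t["response"] == t["emotion"])
    return {cell: correct[cell] / total[cell] for cell in total}

# Hypothetical usage: compare high- and low-PPI groups cell by cell.
# high = recognition_accuracy(trials, "high_PPI")
# low = recognition_accuracy(trials, "low_PPI")
```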
3. The detection of concealed firearm carrying through CCTV: the role of affect recognition

Blechko, Anastassia January 2011 (has links)
This research explored whether a human operator's recognition of offenders carrying a concealed firearm might be based on the recognition of a negative affective state derived from non-verbal behaviour that is accessible from CCTV images. Because the firearm is concealed, it was assumed that human observers would respond to subtle cues which individuals inherently produce whilst carrying a hidden firearm. These cues are believed to be reflected in the body language of those carrying firearms and might be perceived by observers at a conscious or subconscious level. A further hypothesis was that the ability to recognise the carrier of a concealed firearm in CCTV footage might be affected by other factors, such as skill in decoding the affective states of others and the viewpoint from which the surveillance targets are observed. To provide a theoretical and experimental basis for these hypotheses, the first objective was to examine the extant literature to determine what is known about the recognition of affect from non-verbal cues (e.g. facial expressions and body movement), and how it can be applied to the detection of human mal-intent. A second objective was to explore this subject in relation to the detection of concealed firearm carrying through a series of experimental studies. The studies employed experts (CCTV operators) and, predominantly, lay people as participants, and used various experimental techniques, including questionnaires and eye-tracking, to investigate the topic. The results show that human observers appear to use visual indicators of the affective state of surveillance targets when deciding whether or not an individual is carrying a concealed firearm. The most prominent cues were the face and upper body of surveillance targets, together with gait, posture and arm movements. The test of decoding ability did not show a substantial relationship with the ability to detect a concealed firearm bearer, and performance on the task may be view-dependent. Further research into this topic will be needed to generate strategies that support reliable detection of concealed firearm carrying through the use of related affective behavioural cues.
4. Facial affect recognition in psychosis

Bordon, Natalie Sarah January 2016 (has links)
While a correlation between psychosis and an increased risk of engaging in aggressive behaviours has been established, many factors have been explored which may contribute to this risk. Patients with a diagnosis of psychosis have been shown to have significant difficulties in facial affect recognition (FAR), and some authors have proposed that this may contribute to an increased risk of displaying aggressive or violent behaviours. A systematic review of the current evidence regarding the links between facial affect recognition and aggression was conducted. Results were mixed, with some studies providing evidence of a link between emotion recognition difficulties and aggression, while others were unable to establish such an association. Results should be interpreted with some caution, as the quality of the included studies was poor due to small sample sizes, insufficient power and limited reporting of results. Adequately powered, randomised controlled studies using appropriate blinding procedures and validated measures are therefore required. There is a substantial evidence base demonstrating difficulties in emotional perception in patients with psychosis, with evidence suggesting a relationship with reduced social functioning, increased aggression and more severe symptoms of psychosis. The second part of this work therefore reviews the field to assess whether there is a causal link between facial affect recognition difficulties and psychosis. The Bradford Hill criteria for establishing a causal relationship from observational data were used to generate key hypotheses, which were then tested against existing evidence. Where a published meta-analysis was not already available, new meta-analyses were conducted. A large effect of FAR difficulties was found in those with a diagnosis of psychosis, along with a small to moderate correlation between FAR problems and symptoms of psychosis. Evidence was provided for the existence of FAR problems in those at clinical high risk of psychosis, while remediation of psychosis symptoms did not appear to impact FAR difficulties. There appears to be good evidence that facial affect recognition difficulties are implicated in the causation of psychosis, though larger, longitudinal studies are required to provide further evidence of this.
5. Social cognition in antisocial populations

Bratton, Helen January 2015 (has links)
Introduction: Impairments in facial affect recognition have been linked to the development of various disorders. The aim of the current work is to conduct a systematic review and meta-analysis of studies examining whether this ability is impaired in males with psychopathy or antisocial traits, when compared to healthy individuals. Method: Studies were eligible for inclusion if they compared facial affect recognition in (a) psychopathic vs. antisocial males, (b) psychopathic males vs. healthy controls, or (c) antisocial males vs. healthy controls. Primary outcomes were group differences in overall emotion recognition, fear recognition, and sadness recognition. Secondary outcomes were differences in recognition of disgust, happiness, surprise and anger. Results: Fifteen papers comprising 214 psychopathic males, 491 antisocial males and 386 healthy community controls were identified. In psychopathy, limited evidence suggested impairments in fear (k=2), sadness (k=1) and surprise (k=1) recognition relative to healthy individuals, but overall affect recognition ability was not affected (k=2). Findings were inconclusive for antisocial samples (k=4-6), although impairments in surprise (k=4) and disgust (k=5) recognition were observed. Psychopathic and antisocial samples did not differ in their ability to detect sadness (k=4), but psychopathic males were less able to recognise happiness (k=4) and surprise (k=3). Conclusion: Limited evidence suggests psychopathic and antisocial personality traits are associated with small to moderate deficits in specific aspects of emotion recognition. However, considerable heterogeneity was identified, and study quality was often poor. Adequately powered studies using validated assessment measures, rater masking and a priori public registration of hypotheses and methods are required.
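For readers unfamiliar with how such per-emotion effects are pooled, here is a minimal sketch of a DerSimonian-Laird random-effects meta-analysis over per-study standardized mean differences; it is a generic illustration of the technique, not the analysis code or data from this thesis.

```python
import numpy as np

def dersimonian_laird(effects, variances):
    """Pool per-study effect sizes (e.g. standardized mean differences)
    under a random-effects model; returns the pooled effect, its standard
    error, and the estimated between-study variance tau^2."""
    y = np.asarray(effects, dtype=float)
    v = np.asarray(variances, dtype=float)
    w = 1.0 / v                                    # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)            # fixed-effect pooled estimate
    q = np.sum(w * (y - y_fixed) ** 2)             # Cochran's Q (heterogeneity)
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(y) - 1)) / c)        # between-study variance
    w_re = 1.0 / (v + tau2)                        # random-effects weights
    pooled = np.sum(w_re * y) / np.sum(w_re)
    return pooled, np.sqrt(1.0 / np.sum(w_re)), tau2

# Hypothetical fear-recognition deficits from three studies:
# dersimonian_laird(effects=[0.45, 0.62, 0.20], variances=[0.05, 0.09, 0.04])
```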
6. 7- and 12-Month-Olds' Intermodal Recognition of Affect: 7-Month-Olds are "Smarter" than 12-Month-Olds

Whiteley, Mark Oborn 30 June 2011 (has links) (PDF)
Research has shown that by 7 months of age infants demonstrate recognition of emotion by successfully matching faces and voices based on affect in an intermodal matching procedure. It is often assumed that once an ability is present the development of that ability has "ceased." As a result, no research has examined whether and how the ability to match faces and voices based on affect develops after the first 7 months. This study examined how the ability to match faces and voices based on affect changes from 7 to 12 months. Analyses of infants' proportion of total looking time (PTLT) showed that, consistent with previous research, 7-month-old infants looked significantly longer at the affectively congruent facial expression. However, 12-month-olds showed no matching of faces and voices. Further analyses showed that 7-month-olds also increased their looking to facial expressions while being presented with the affectively congruent vocal expression. Once again, 12-month-olds failed to show significant matching. That 7-month-olds were able to demonstrate matching while 12-month-olds failed to do so is possibly a result of 12-month-olds attending to other information. More research is needed to better understand how infants' recognition of affect and overall perceptual abilities change as they develop.
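As a quick illustration of the looking-time measure referred to above, the proportion of total looking time directed to the congruent display can be computed as follows; the function and variable names are assumptions, not the study's code.

```python
def ptlt_congruent(congruent_look_s, incongruent_look_s):
    """Proportion of total looking time spent on the affectively congruent
    facial expression in one trial (inputs are looking times in seconds)."""
    total = congruent_look_s + incongruent_look_s
    if total == 0:
        return float("nan")   # infant looked at neither display
    return congruent_look_s / total

# Matching is inferred when mean PTLT across trials reliably exceeds 0.5 (chance).
# e.g. ptlt_congruent(6.2, 3.8) -> 0.62
```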
7. Facial Emotion Recognition In Children With Asperger's Disorder And In Children With Social Phobia

Wong, Nina 01 January 2010 (has links)
Recognizing emotion from facial expressions is an essential skill for effective social functioning and establishing interpersonal relationships. Asperger's Disorder (AD) and Social Phobia (SP) are two clinical populations showing impairment in social skills and perhaps emotion recognition. Objectives: The primary objectives were to determine whether facial emotion recognition abilities are distinct in children with AD and children with SP relative to typically developing children (TD), and to examine the role of expression intensity in the recognition of facial affect. Method: Fifty-seven children (19 AD, 17 SP, and 21 TD) aged 7-13 years participated in the study. Reaction times and accuracy were measured as children identified neutral faces and faces displaying anger, disgust, fear, happiness, and sadness at two different intensity levels. Results: Mixed-model ANOVAs with group and emotion type revealed that all children responded faster and more accurately to expressions of happiness, but there were no other group differences. Additional analyses indicated that intensity of the displayed emotion influenced facial affect detection ability for several basic emotions (happiness, fear, and anger). Across groups, there was no pattern of specific misidentification of emotion (e.g., children did not consistently misidentify one emotion, such as disgust, as a different emotion, such as anger). Finally, facial affect recognition abilities were not associated with behavioral ratings of overall anxiety or social skills effectiveness in structured role play interactions. Conclusions: Distinct facial affect recognition deficits in the clinical groups emerge when the intensity of the emotion expression is considered. Implications for using behavioral assessments to delineate the relationship between facial affect recognition abilities and social functioning among clinical populations are discussed.
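As a sketch of the group-by-emotion analysis described in this abstract, the following runs a mixed-design ANOVA (between-subjects group, within-subjects emotion) on reaction times using the pingouin package; the toy data and column names are assumptions, not the study's data or code.

```python
import pandas as pd
import pingouin as pg

# Long format: one mean reaction time (ms) per child per emotion (toy values).
df = pd.DataFrame({
    "child":   [1, 1, 2, 2, 3, 3, 4, 4, 5, 5, 6, 6],
    "group":   ["AD"] * 4 + ["SP"] * 4 + ["TD"] * 4,
    "emotion": ["happiness", "fear"] * 6,
    "rt":      [612, 745, 640, 760, 650, 801, 670, 815, 590, 700, 605, 690],
})

# Mixed-design ANOVA: 'group' varies between children, 'emotion' within children.
aov = pg.mixed_anova(data=df, dv="rt", within="emotion",
                     between="group", subject="child")
print(aov)
```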
8. Social cognition deficits and violence in people with a diagnosis of schizophrenia

Langham, Heather January 2015 (has links)
Introduction: It is widely reported that people with schizophrenia have social cognition deficits. In addition to their negative impact on functioning and quality of life, these deficits may also contribute to the use of violence. It has recently been established that social cognitive interventions (SCIs) can ameliorate deficits in facial affect recognition (FAR). This project aimed to systematically review whether SCIs can also improve theory of mind (ToM) abilities in people with schizophrenia. The empirical study aimed to explore whether the extent of the deficits in FAR and ToM in people with schizophrenia differed between those with and without a substantial history of violence. Method: A systematic review was undertaken to identify studies where SCIs were provided to adults with schizophrenia or schizoaffective disorder. Key findings were highlighted, with the quality of the studies’ methodology and reporting assessed. A quantitative research study was also undertaken involving 22 men aged 18-64 with a diagnosis of schizophrenia or schizoaffective disorder, comparing those with and without a substantial history of violence (SHV) on measures of FAR and ToM. Results: The majority of the 13 studies included in the systematic review found that the provision of SCIs led to significant improvements in ToM. However, all studies demonstrated a potential for bias and were limited by inadequate sample size. In the empirical study, less than half of participants scored within the normal range for overall FAR ability, with no difference identified between the SHV and no-SHV group. However, the SHV group were poorer at recognising sadness and showed a tendency to perform better at the detection of faux pas, compared to the no-SHV group. Conclusions: The systematic review identified that a wide range of SCIs can improve ToM abilities in people with schizophrenia. Its findings highlight that stringent, adequately powered studies should be undertaken, utilising standardised assessments of a range of levels of ToM ability, to enable identification of the most effective intervention. The findings of the empirical study are limited by a small and imbalanced sample size between groups and so must be interpreted with caution. However, patterns observed in the results highlight areas for further exploration. The strengths of this study’s design and recruitment challenges are discussed.
9. Modeling User Affect Using Interaction Events

Alhothali, Areej 20 June 2011 (has links)
Emotions play a significant role in many human mental activities, including decision-making, motivation, and cognition. Various intelligent and expert systems can be empowered with emotionally intelligent capabilities, especially systems that interact with humans and mimic human behaviour. However, most current methods in affect recognition studies use intrusive, lab-based, and expensive tools which are unsuitable for real-world situations. Inspired by studies on keystroke dynamics, this thesis investigates the effectiveness of diagnosing users’ affect through their typing behaviour in an educational context. To collect users’ typing patterns, a field study was conducted in which subjects used a dialogue-based tutoring system built by the researcher. Eighteen dialogue features, associated with subjective and objective ratings of users’ emotions, were collected. Several classification techniques were assessed in diagnosing users’ affect, including discriminant analysis, Bayesian analysis, decision trees, and neural networks. An artificial neural network approach was ultimately chosen as it yielded the highest accuracy compared with the other methods. To lower the error rate, a hierarchical classification was implemented to first classify user emotions based on their valence (positive or negative) and then perform a finer classification step to determine which emotion the user experienced (delighted, neutral, confused, bored, or frustrated). The hierarchical classifier was successfully able to diagnose users’ emotional valence, while it was moderately able to classify users’ emotional states. The overall accuracy obtained from the hierarchical classifier significantly outperformed previous dialogue-based approaches and was in line with some affective computing methods.
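A minimal sketch of the two-stage idea described above — predict valence first, then the specific emotion within that valence branch — using scikit-learn MLP classifiers as a stand-in for the thesis's own neural network; the valence grouping (placing 'neutral' on the positive side) and the feature matrix of 18 dialogue features are assumptions for illustration.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

NEGATIVE = {"confused", "bored", "frustrated"}   # assumed valence grouping
                                                 # ('neutral' treated as positive here)

class HierarchicalAffectClassifier:
    """Stage 1 predicts valence from the dialogue features;
    stage 2 predicts the specific emotion within the predicted valence."""

    def __init__(self):
        self.valence_clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000)
        self.branch = {v: MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000)
                       for v in ("neg", "pos")}

    def fit(self, X, emotions):
        X = np.asarray(X)
        emotions = np.asarray(emotions)
        valence = np.where(np.isin(emotions, list(NEGATIVE)), "neg", "pos")
        self.valence_clf.fit(X, valence)
        for v in ("neg", "pos"):                 # one fine-grained model per branch
            self.branch[v].fit(X[valence == v], emotions[valence == v])
        return self

    def predict(self, X):
        X = np.asarray(X)
        valence = self.valence_clf.predict(X)
        return np.array([self.branch[v].predict(x.reshape(1, -1))[0]
                         for v, x in zip(valence, X)])

# Hypothetical usage with an (n_samples, 18) feature matrix:
# model = HierarchicalAffectClassifier().fit(X_train, emotion_labels)
# predictions = model.predict(X_test)
```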
10. Inferring Speaker Affect in Spoken Natural Language Communication

Pon-Barry, Heather Roberta 15 March 2013 (has links)
The field of spoken language processing is concerned with creating computer programs that can understand human speech and produce human-like speech. Regarding the problem of understanding human speech, there is currently growing interest in moving beyond speech recognition (the task of transcribing the words in an audio stream) and towards machine listening—interpreting the full spectrum of information in an audio stream. One part of machine listening, the problem that this thesis focuses on, is the task of using information in the speech signal to infer a person’s emotional or mental state. In this dissertation, our approach is to assess the utility of prosody, or manner of speaking, in classifying speaker affect. Prosody refers to the acoustic features of natural speech: rhythm, stress, intonation, and energy. Affect refers to a person’s emotions and attitudes such as happiness, frustration, or uncertainty. We focus on one specific dimension of affect: level of certainty. Our goal is to automatically infer whether a person is confident or uncertain based on the prosody of his or her speech. Potential applications include conversational dialogue systems (e.g., in educational technology) and voice search (e.g., smartphone personal assistants). There are three main contributions of this thesis. The first contribution is a method for eliciting uncertain speech that binds a speaker’s uncertainty to a single phrase within the larger utterance, allowing us to compare the utility of contextually-based prosodic features. Second, we devise a technique for computing prosodic features from utterance segments that both improves uncertainty classification and can be used to determine which phrase a speaker is uncertain about. The level of certainty classifier achieves an accuracy of 75%. Third, we examine the differences between perceived, self-reported, and internal level of certainty, concluding that perceived certainty is aligned with internal certainty for some but not all speakers and that self-reports are a good proxy for internal certainty. / Engineering and Applied Sciences
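To illustrate the general approach of classifying a speaker's certainty from prosody, here is a minimal sketch that extracts a few prosodic summaries (pitch and energy statistics) per utterance segment with librosa and feeds them to a simple classifier; the feature set, labels, and file paths are assumptions, not the prosodic features or models developed in this thesis.

```python
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression

def prosodic_features(wav_path):
    """Crude prosodic summary of one utterance segment:
    mean/std of frame-level pitch (F0) and of RMS energy."""
    y, sr = librosa.load(wav_path, sr=16000)
    f0 = librosa.yin(y, fmin=60, fmax=400, sr=sr)   # frame-level pitch track
    rms = librosa.feature.rms(y=y)[0]               # frame-level energy
    return np.array([f0.mean(), f0.std(), rms.mean(), rms.std()])

# Hypothetical usage: each row of X describes one segment; labels mark
# whether the speaker was judged uncertain (1) or confident (0).
# X = np.vstack([prosodic_features(path) for path in segment_paths])
# clf = LogisticRegression(max_iter=1000).fit(X, labels)
# uncertainty_prob = clf.predict_proba(prosodic_features(new_path).reshape(1, -1))[0, 1]
```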
