11

Emotion processing after childhood Acquired Brain Injury (ABI) : an eye tracking study

Oliphant, Jenna January 2012 (has links)
Few studies have explored emotion processing abilities in children following Acquired Brain Injury (ABI). This study extends previous research in this area by exploring emotion processing skills in children with focal ABI, using eye tracking technology. It was hypothesised that children with focal ABI would demonstrate impaired emotion recognition abilities relative to a control group and that, similar to adult eye tracking studies, they would show an atypical pattern of eye movements when viewing faces. Sixteen participants with focal ABI (10-16 years) and 27 healthy controls (10-16 years) completed one novel and one adapted visual emotion processing task, presented using a T120 Tobii eye-tracker. The eye-tracker measured eye-movement fixations in three areas of interest (AOIs: eyes, nose, mouth) as participants viewed the stimuli. Emotion perception accuracy was recorded. All participants from the ABI group also completed neuropsychological assessment of their immediate visual memory, visual attention, visuospatial abilities, and everyday executive function. The results showed no significant difference in accuracy between the ABI and control groups, although on average children with ABI appeared slightly less accurate than controls in both emotion recognition tasks. Within-subjects analysis revealed no effect of lesion location and laterality, or of age at lesion onset, upon emotion recognition accuracy. Eye tracking analysis showed that children in the ABI group presented with an atypical pattern of eye movements relative to the control group, demonstrating significantly greater fixation times within the eye region when viewing disgusted, fearful, angry and happy faces. The ABI group also showed reduced mean percentage fixation duration within the nose and mouth regions, relative to controls. Furthermore, the ABI group took longer on average to give an accurate response to sad, disgusted, happy and surprised faces, and this difference reached statistical significance for the accurate recognition of happy and surprised faces. It is suggested that the atypical fixation patterns noted within the ABI group may represent a difficulty with dividing visual attention rapidly across the whole of the face. This slowing may affect functioning in everyday social situations, where rapid processing and appraisal of emotion is thought to be particularly important. Eye tracking technology may therefore be a valuable method for identifying subtle difficulties in facial emotion processing following focal ABI in childhood, and may also have an application in the rehabilitation of these difficulties in future.
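The group comparison above hinges on the mean percentage fixation duration per AOI. As a rough illustration of how that metric can be computed (a sketch with invented participant records, not code from the thesis):

```python
# A sketch (invented records, not the thesis's code) of the per-AOI metric:
# percentage fixation duration across the eyes, nose and mouth regions.
from collections import defaultdict

# Hypothetical fixation records: (participant_id, aoi, duration_ms)
fixations = [
    ("p01", "eyes", 420), ("p01", "nose", 180), ("p01", "mouth", 90),
    ("p02", "eyes", 260), ("p02", "nose", 310), ("p02", "mouth", 150),
]

def pct_fixation_by_aoi(records):
    """Return {participant: {aoi: percentage of that participant's total fixation time}}."""
    totals = defaultdict(float)
    per_aoi = defaultdict(lambda: defaultdict(float))
    for pid, aoi, dur in records:
        totals[pid] += dur
        per_aoi[pid][aoi] += dur
    return {pid: {aoi: 100.0 * d / totals[pid] for aoi, d in aois.items()}
            for pid, aois in per_aoi.items()}

print(pct_fixation_by_aoi(fixations))
# p01: eyes ~60.9%, nose ~26.1%, mouth ~13.0%
```

Group means per AOI then follow by averaging these percentages over each group's participants.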
12

Facial emotion expression, recognition and production of varying intensity in the typical population and on the autism spectrum

Wingenbach, Tanja January 2016 (has links)
The current research project aimed to investigate facial emotion processing using specially developed and validated video stimuli of facial emotional expressions at varying levels of intensity. To this end, videos were developed showing real people expressing emotions in real time (anger, disgust, fear, sadness, surprise, happiness, contempt, embarrassment, and neutral) at different expression intensity levels (low, intermediate, high), called the Amsterdam Dynamic Facial Expression Set – Bath Intensity Variations (ADFES-BIV). The ADFES-BIV was validated on all its emotion and intensity categories. Sex differences in facial emotion recognition were investigated, and a female advantage over males was found. This demonstrates that the ADFES-BIV is suitable for investigating group comparisons in facial emotion recognition in the general population. Facial emotion recognition from the ADFES-BIV was further investigated in a sample characterised by deficits in social functioning: individuals with an Autism Spectrum Disorder (ASD). A deficit in facial emotion recognition was found in ASD compared to controls, and error analysis revealed emotion-specific deficits in detecting emotional content from faces (sensitivity) alongside deficits in differentiating between emotions from faces (specificity). The ADFES-BIV was combined with face electromyogram (EMG) to investigate facial mimicry and the effects of proprioceptive feedback (from explicit imitation and blocked facial mimicry) on facial emotion recognition. Based on the reverse simulation model, it was predicted that facial mimicry would be an active component of the facial emotion recognition process. Experimental manipulations of face movements did not reveal an advantage of facial mimicry over the blocked facial mimicry condition. While no support was found for the reverse simulation model, enhanced proprioceptive feedback can facilitate or hinder recognition of facial emotions, in line with embodied cognition accounts.
13

Socio-emotional processing in children, adolescents and young adults with traumatic brain injury

Dendle, Jac Rhys January 2014 (has links)
Objective: Research has demonstrated deficits in socio-emotional processing following childhood traumatic brain injury (TBI; Tonks et al., 2009a). However, it is not known whether a link exists between socio-emotional processing, TBI and offending. Drawing on Ochsner's (2008) socio-emotional processing model, the current study aimed to investigate facial emotion recognition accuracy and bias in young offenders with TBI. Setting: Research was conducted across three youth offender services. Participants: Thirty-seven participants completed the study. Thirteen participants reported a high dosage of TBI. Design: The study had a cross-sectional within- and between-subjects design. Main Measures: Penton-Voak and Munafò's (2012) emotion recognition task was completed. Results: The results indicated that young offenders with a TBI were not significantly worse at facial emotion recognition than those with no TBI. Both groups showed a bias towards positive emotions. No between-group differences were found for emotion bias. Conclusion: The findings did not support the use of Ochsner's (2008) socio-emotional processing model for this population. Due to the small sample size, inadequate power and lack of a non-offender control group, it is not possible to draw firm conclusions from the results of this study. Future research should aim to investigate whether there are any links between TBI, socio-emotional processing and offending.
14

AUTOMATED FACIAL EMOTION RECOGNITION: DEVELOPMENT AND APPLICATION TO HUMAN-ROBOT INTERACTION

Liu, Xiao 28 August 2019 (has links)
No description available.
15

Applying Facial Emotion Recognition to Usability Evaluations to Reduce Analysis Time

Chao, Gavin Kam 01 June 2021 (has links) (PDF)
Usability testing is an important part of product design that offers developers insight into a product's ability to help users achieve their goals. Despite its usefulness, human usability evaluation is a costly and time-intensive process. Developing methods to reduce the time and cost of usability evaluations is important for organizations that want to improve the usability of their products without expensive investments. One prospective solution is the application of facial emotion recognition to automate the collection of qualitative metrics normally identified by human usability evaluators. In this paper, facial emotion recognition (FER) was applied to mock usability recordings to evaluate how well FER could parse moments of emotional significance. To determine the accuracy of FER in this context, a FER Python library created by Justin Shenk was compared against data tags produced by human reporters. This study found that the facial emotion recognizer matched its output to less than 40% of the human-reported emotion timestamps, and that less than 78% of the emotion data tags were recognized at all. The current lack of consistency with the human-reported emotions found in this thesis makes it difficult to recommend using FER for parsing moments of semantic significance over conventional human usability evaluators.
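The library referenced here is the open-source `fer` package on PyPI. A minimal sketch of the kind of pipeline the thesis describes (sampling frames from a recording, keeping the dominant detected emotion, then matching detections against human-reported tags) might look like the following; the file name, the one-frame-per-second sampling and the two-second matching tolerance are assumptions, not details taken from the thesis:

```python
# Sketch only: per-frame emotion detection on a usability recording using
# the `fer` package (pip install fer) and OpenCV. Names and thresholds
# below are illustrative assumptions.
import cv2
from fer import FER

detector = FER(mtcnn=True)           # MTCNN face detection: slower, more robust
cap = cv2.VideoCapture("usability_session.mp4")
fps = cap.get(cv2.CAP_PROP_FPS) or 30.0

detections = []                      # (timestamp_s, dominant_emotion, score)
frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    if frame_idx % int(fps) == 0:    # sample roughly one frame per second
        emotion, score = detector.top_emotion(frame)
        if emotion is not None:      # None when no face was found
            detections.append((frame_idx / fps, emotion, score))
    frame_idx += 1
cap.release()

# Count human tags (timestamp_s, emotion) that a detection of the same
# emotion falls within +/- tol seconds of, one plausible way to compute
# the kind of match rate reported above.
def matches(human_tags, detections, tol=2.0):
    return sum(
        any(abs(t - dt) <= tol and e == de for dt, de, _ in detections)
        for t, e in human_tags
    )
```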
16

Computational Techniques for Human Smile Analysis

Ugail, Hassan, Aldahoud, Ahmad A.A. 20 March 2022 (has links)
Explains how to implement computational techniques for human smile analysis. Shares insights into the human personality traits hidden in a smile. Enriches the understanding of human emotions through examples of face analysis. Includes key examples of the practical use of computer-based smile analysis.
17

Computational Techniques for Human Smile Analysis

Ugail, Hassan, Al-dahoud, Ahmad 20 March 2022 (has links)
How many times have you smiled today? How many times have you frowned today? Have you ever thought of being in a state of self-consciousness, able to relate your own mood to your facial emotional expressions? Perhaps, with our present-day busy lives, we do not consider these to be crucial questions. However, as researchers uncover more and more about the human emotional landscape, they are learning the importance of understanding our emotions.
18

Sex differences in cognition in Alzheimer's disease

Irvine, Karen January 2014 (has links)
Inspection of the published research shows that sex differences in cognition in the general population have been widely cited, with the direction of the advantage depending on the domain being examined. The most prevalent claims are that men are better than women at visuospatial and mathematical tasks, whereas women have superior verbal skills and perform better than men on tasks assessing episodic memory. There is also some evidence that women are more accurate than men at identifying facial expressions of emotion. A more in-depth examination of the literature, however, reveals that evidence of such differences is not as conclusive as it would at first appear. Not only are the direction and magnitude of sex differences dependent on the cognitive domain but also on the individual tasks: some visuospatial tasks show no difference (e.g. figure copying), whilst men have been shown to be better than women at confrontation naming (a verbal task). Alzheimer's disease (AD) is a heterogeneous illness that affects the elderly. It manifests with deficits in cognitive abilities and behavioural difficulties. It has been suggested that some of the behavioural issues may arise from difficulties with recognising facial emotion expressions. There have been claims that AD affects men and women differently: women have been reported as being more likely to develop AD and as showing greater dementia severity than men with equivalent neuropathology. Despite this, research into sex differences in cognition in AD is scarce and conflicting. This research was concerned with the effect of sex on the cognitive abilities of AD patients. The relative performance of men and women with AD was compared to that of elderly controls. The study focused on the verbal, visuospatial and facial emotion recognition domains. Data were collected and analysed from 70 AD patients (33 male, 37 female), 62 elderly controls (31 male, 31 female) and 80 young adults (40 male, 40 female). Results showed that those with AD demonstrate cognitive deficits compared to elderly controls in verbal and visuospatial tasks but not in the recognition of facial emotions. There were no significant sex differences in either the young adults or the healthy elderly controls, but sex differences favouring men emerged in the AD group for figure copying and recall and for confrontation naming. Given that elderly men and women perform equivalently on these tasks, this represents a deterioration in women's cognitive abilities relative to men's. Further evidence of such an adverse effect of AD was apparent in other tasks too: for most verbal and visuospatial tasks, either an effect favouring women in the elderly is reversed or a male advantage increases in magnitude. There was no evidence of sex differences in facial emotion recognition for any group. This suggests that the lack of published findings on sex differences in this domain is due to the difficulty of getting null findings accepted for publication. The scarcity of research examining sex differences in other domains is also likely to be due to this bias.
19

A model for facial emotion inference based on planar dynamic emotional surfaces

Ruivo, João Pedro Prospero 21 November 2017 (has links)
Emotions have a direct influence on human life and are of great importance in relationships and in the way interactions between individuals develop. Because of this, they are also important for the development of human-machine interfaces that aim to maintain natural and friendly interactions with their users. In the development of social robots, which this work aims at, a suitable interpretation of the emotional state of the person interacting with the robot is indispensable. The focus of this work is the development of a mathematical model for recognizing emotional facial expressions in a sequence of frames. Firstly, a face tracking algorithm is used to find and keep track of a human face in images; then relevant features are extracted from this face and fed into the emotional state recognition model developed in this work, which consists of an instantaneous emotional expression classifier, a Kalman filter and a dynamic classifier, which gives the final output of the model. The model is optimized via a simulated annealing algorithm and evaluated on relevant datasets, with its performance measured for each of the considered emotional states.
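As an illustration of the pipeline's middle stage, a one-dimensional Kalman filter that smooths the instantaneous classifier's per-frame score before it reaches the dynamic classifier might be sketched as follows (the noise variances are illustrative assumptions, not the thesis's tuned parameters):

```python
# Illustrative 1-D Kalman filter smoothing a noisy per-frame emotion score,
# in the spirit of the pipeline's middle stage. q and r are assumptions.
import numpy as np

def kalman_smooth(scores, q=1e-3, r=1e-1):
    """Smooth a sequence of instantaneous classifier scores.

    q: process noise variance (how fast the true state may drift)
    r: measurement noise variance (how noisy each frame's score is)
    """
    x, p = scores[0], 1.0          # initial state estimate and covariance
    out = [x]
    for z in scores[1:]:
        p = p + q                  # predict: state assumed locally constant
        k = p / (p + r)            # Kalman gain
        x = x + k * (z - x)        # update with the new measurement z
        p = (1.0 - k) * p
        out.append(x)
    return np.array(out)

noisy = 0.7 + 0.15 * np.random.randn(100)   # synthetic "happiness" scores
smooth = kalman_smooth(noisy)                # fed onward to a dynamic classifier
```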
