
Improving facial expression recognition in children with autism spectrum disorder: effectiveness of a computer assisted intervention

Murphy, Patrick N. 04 October 2017
Evidence suggests that computer-assisted interventions (CAIs) have advantages over other types of instruction when teaching children with Autism Spectrum Disorder (ASD). A growing number of technology-based tools for use in educational settings have been developed to address specific deficits associated with ASD, namely poor facial expression recognition. Given the proliferation of CAIs, there is an urgent need to test their application in real-world and clinical settings. Building on previous research on the success of CAIs in supporting children with ASD in this area, this research was designed as a small-scale pilot study to explore the feasibility and potential educational benefits of a relatively new CAI, Let’s Face It! Scrapbook (LFI!). The study examined the viability of the LFI! program in a clinical setting in which two groups of children with ASD worked one-on-one with behavioural interventionists to develop necessary life skills. The experimental condition (n = 3), which received natural environment teaching (NET) of emotions plus LFI! exercises, performed better on post-test tasks of facial expression recognition than the control condition (n = 3), which received natural environment teaching alone. Participating behavioural interventionists, reporting on their experiences with the app, preferred this method of teaching, citing the greater amount of available teaching material, the enriched level of engagement required between client and interventionist, and the fun nature of the program. Though small in scale, this pilot study suggests that the LFI! program is a viable tool for training facial expression recognition with clients with ASD in clinical settings. / Graduate
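The pilot design described above, a small-n pre/post comparison between a NET-only control condition and a NET + LFI! experimental condition, can be sketched as a simple gain analysis. The scores below are entirely hypothetical (the abstract does not report raw data); they only illustrate the structure of the comparison:

```python
# Hypothetical pre/post facial-expression-recognition scores (proportion correct)
# for a small-n pilot of this shape; the values are illustrative, not reported data.
experimental = {"pre": [0.40, 0.35, 0.50], "post": [0.75, 0.70, 0.80]}  # NET + LFI!
control = {"pre": [0.45, 0.40, 0.35], "post": [0.55, 0.50, 0.45]}       # NET only

def mean(xs):
    return sum(xs) / len(xs)

def gain(group):
    """Average pre-to-post improvement for one condition."""
    return mean(group["post"]) - mean(group["pre"])

exp_gain = gain(experimental)
ctl_gain = gain(control)
print(f"NET + LFI! gain: {exp_gain:.2f}")
print(f"NET-only gain:   {ctl_gain:.2f}")
```

With only three participants per condition, a descriptive comparison of gains like this is about all a pilot of this size supports, which is consistent with the abstract's framing of the result as feasibility evidence rather than a definitive effect.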

Perception des émotions non verbales dans la musique, les voix et les visages chez les adultes implantés cochléaires présentant une surdité évolutive / Perception of non-verbal emotions before and after cochlear implantation in adults with progressive deafness

Ambert-Dahan, Emmanuèle 11 July 2014
While cochlear implantation is quite successful at restoring speech comprehension in quiet environments, other auditory tasks, such as communication in noisy environments or music perception, remain very challenging for cochlear implant (CI) users. Communication is multimodal: information is transmitted by vocal and facial expressions, which are crucial for interpreting a speaker’s emotional state. Yet very few studies have examined the perception of non-verbal emotions in cases of progressive sensorineural hearing loss in adults. The aim of this thesis was to test the influence of rehabilitation by CI after acquired deafness on emotional judgements of musical excerpts and non-verbal voices. We also examined the influence of acquired post-lingual progressive deafness on emotional judgements of faces.
For this purpose, we conducted four experimental studies in which the performance of deaf and cochlear-implanted subjects was compared with that of normal-hearing (NH) controls. To assess emotional judgement in music, voices, and faces, we used a task consisting of emotional category identification (e.g. happiness, fear, anger, or peacefulness for music, plus a neutral category) and dimensional judgement of valence and arousal. The first two studies evaluated emotional perception in the auditory modality, successively examining the recognition of emotions in music and in voices. The two following studies focused on emotion recognition in the visual modality, particularly emotional facial expressions before and after cochlear implantation. The results of these studies revealed greater deficits in emotion recognition in the musical and vocal domains than in the visual domain, as well as a disturbance of arousal judgements, stimuli being perceived as less arousing by CI patients than by NH subjects. Nevertheless, recognition of emotions in music and voices, although limited, was above chance level, demonstrating the benefits of the CI for processing auditory emotions. Conversely, valence judgements were not impaired in the musical, vocal, or facial tests. Surprisingly, the results also suggest that, at least for a subgroup of patients, recognition of facial emotions is affected by acquired deafness, indicating consequences of progressive hearing loss for the processing of emotion presented in another modality. Thus, it seems that progressive deafness, as well as the limited spectral cues transmitted by the cochlear implant, may foster verbal communication to the detriment of non-verbal emotional communication.
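The claim that recognition was "above chance level" can be checked with a one-sided exact binomial test against the chance rate implied by the number of response categories. The trial and success counts below are hypothetical, chosen only to show how such a check works for a four-category task like the music condition described above:

```python
from math import comb

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p): exact one-sided upper-tail probability."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical example: 4 emotion categories -> chance rate = 0.25.
# Suppose a CI listener labels 18 of 40 musical excerpts correctly.
n_trials, n_correct, chance = 40, 18, 0.25
p_value = binom_sf(n_correct, n_trials, chance)
print(f"Accuracy {n_correct / n_trials:.2f} vs chance {chance}: p = {p_value:.4f}")
```

A small p-value here would indicate performance reliably above the 25% chance rate, which is the logic behind the "above chance but below NH controls" pattern the abstract reports for music and voices.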
