1

Behavioral and Neural Correlates of Speech Perception Outcomes in Adults with Cochlear Implants

Manning, Jacy
Postlingually deafened adults with cochlear implants (CIs) show large variability in speech perception abilities. Although CIs are among the most successful neural prosthetic devices, they cannot adequately convey fine-structure cues, which leaves the listener with a degraded signal to interpret. While behavioral measures remain the gold standard for determining speech perception abilities, an objective measure is needed for patients who are unable to provide reliable behavioral responses. This study collected behavioral, cognitive, and neural measures to identify potential neural biomarkers that correlate with speech perception performance. Behavioral experiments evaluated participants' abilities to identify, discriminate, and recognize words and sentences in quiet and in noise. Cognitive measures assessed the roles of attention, impulse control, memory, and cognitive flexibility in speech recognition. Auditory event-related potentials (ERPs) were obtained with a double oddball paradigm to elicit the mismatch negativity (MMN) response, which has been associated with phonetic categorical perception at the group level. The results indicated that executive function is highly predictive of speech performance and that the MMN is associated with categorical perception at the individual level. These findings are clinically relevant for determining appropriate follow-up care after implantation.
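
As an illustration of how an MMN of the kind described above is conventionally derived, the sketch below computes a deviant-minus-standard difference wave with MNE-Python. The file name, trigger codes, electrode (Fz), filter settings, and analysis window are illustrative assumptions, not details taken from this thesis.

    # Minimal sketch (assumed details, not the author's pipeline): MMN
    # difference wave from an oddball paradigm using MNE-Python.
    import mne

    raw = mne.io.read_raw_fif("oddball_raw.fif", preload=True)  # hypothetical file
    raw.filter(l_freq=0.1, h_freq=30.0)                         # typical ERP band-pass

    events = mne.find_events(raw)
    event_id = {"standard": 1, "deviant": 2}                    # assumed trigger codes

    epochs = mne.Epochs(raw, events, event_id=event_id,
                        tmin=-0.1, tmax=0.5, baseline=(None, 0),
                        reject=dict(eeg=100e-6), preload=True)

    # The MMN is conventionally the deviant-minus-standard difference wave.
    standard = epochs["standard"].average()
    deviant = epochs["deviant"].average()
    mmn = mne.combine_evoked([deviant, standard], weights=[1, -1])

    # Peak of the negativity in a typical MMN window at a fronto-central site.
    ch, lat, amp = mmn.copy().pick("Fz").get_peak(
        tmin=0.1, tmax=0.25, mode="neg", return_amplitude=True)
    print(f"MMN at {ch}: {amp * 1e6:.2f} uV at {lat * 1000:.0f} ms")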
2

Posed and genuine smiles: an evoked response potentials study.

Ottley, Mark Carlisle, January 2009
The ability to recognise an individual's affective state from their facial expression is crucial to human social interaction. However, understanding of facial expression recognition processes is limited because mounting evidence has revealed important differences between posed and genuine facial expressions of emotion. Most previous studies of facial expression recognition have used only posed or simulated facial expressions as stimuli, yet, unlike genuine expressions, posed expressions do not reflect the underlying affective state. The current study compared behavioural responses and Evoked Response Potentials (ERPs) to neutral expressions, posed smiles, and genuine smiles during three different tasks. In the first task, no behavioural judgment was required; in the second and third tasks, participants judged whether the person was showing happiness or feeling happiness, respectively. Behavioural results indicated that participants were highly sensitive to the emotional state behind the expressions: genuine smiles were usually labelled as both showing and feeling happiness, whereas posed smiles were far less likely to be labelled as feeling happiness than as showing happiness. Analysis of the P1 and N170 components, and of later orbitofrontal activity, revealed differential activity in response to neutral expressions compared with posed and genuine smiles; this differential activity occurred as early as 135 ms at occipital locations and from 450 ms at orbitofrontal locations. There were significant interactions between participants' behavioural sensitivity to emotional state and P1 and N170 amplitudes. However, no significant difference in ERP activity between posed and genuine smiles was observed until 850 ms at orbitofrontal locations. An additional finding was greater right temporal and left orbitofrontal activation, suggesting hemispheric asymmetry of facial-expression-processing systems.
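
As an illustration of how behavioural sensitivity of this kind is commonly quantified, the sketch below computes a signal-detection d' for the "feeling happiness" judgment, treating genuine smiles as signal trials and posed smiles as noise trials. The function and the hit/false-alarm counts are hypothetical examples and are not taken from this thesis.

    # Minimal sketch (assumed counts, not the study's actual data):
    # signal-detection sensitivity d' = z(hit rate) - z(false-alarm rate).
    from scipy.stats import norm

    def d_prime(hits, misses, false_alarms, correct_rejections):
        # Log-linear correction keeps rates of exactly 0 or 1 finite.
        hit_rate = (hits + 0.5) / (hits + misses + 1.0)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
        return norm.ppf(hit_rate) - norm.ppf(fa_rate)

    # Example: 44/50 genuine smiles judged as "feeling happiness" (hits),
    # 12/50 posed smiles judged the same way (false alarms).
    print(round(d_prime(44, 6, 12, 38), 2))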
