Cognitive resources in audiovisual speech perception

Buchan, Julie N. (11 October 2011)
Most events that we encounter in everyday life provide our different senses with correlated information, and audiovisual speech perception is a familiar instance of multisensory integration. Several approaches will be used to further examine the role of cognitive factors in audiovisual speech perception. The main focus of this thesis will be to examine the influences of cognitive load and selective attention on audiovisual speech perception, as well as the integration of auditory and visual information from talking distractor faces. The influence of cognitive factors on the temporal integration of auditory and visual speech, and on gaze behaviour during audiovisual speech, will also be addressed. The overall results of the experiments presented here suggest that the integration of auditory and visual speech information is quite robust to various attempts to modulate it. Adding a cognitive load task produces minimal disruption of the integration of auditory and visual speech information. Changing attentional instructions to direct subjects to attend selectively to either the auditory or the visual speech information also has a rather modest influence on the observed integration. Generally, the integration of temporally offset auditory and visual information seems rather insensitive to cognitive load or selective attentional manipulations. The processing of visual information from distractor faces appears to be limited. The language of the visually articulating distractors does not appear to provide information that helps match the auditory and visual speech streams. Audiovisual speech distractors are no more distracting than auditory distractor speech paired with a still image, suggesting limited processing or integration of the visual and auditory distractor information. Gaze behaviour during audiovisual speech perception appears to be relatively unaffected by an increase in cognitive load, but is somewhat influenced by attentional instructions to attend selectively to the auditory or visual information. Additionally, both the congruency of the consonant and the temporal offset of the auditory and visual stimuli have small but rather robust influences on gaze.

Thesis (Ph.D., Psychology), Queen's University, 2011.
