
Spatial Hearing with Simultaneous Sound Sources: A Psychophysical Investigation

Best, Virginia Ann, January 2004
This thesis provides an overview of work conducted to investigate human spatial hearing in situations involving multiple concurrent sound sources. Much is known about spatial hearing with single sound sources, including the acoustic cues to source location and the accuracy of localisation under different conditions. However, more recently interest has grown in the behaviour of listeners in more complex environments. Concurrent sound sources pose a particularly difficult problem for the auditory system, as their identities and locations must be extracted from a common set of sensory receptors and shared computational machinery. It is clear that humans have a rich perception of their auditory world, but just how concurrent sounds are processed, and how accurately, are issues that are poorly understood. This work attempts to fill a gap in our understanding by systematically examining spatial resolution with multiple sound sources. A series of psychophysical experiments was conducted on listeners with normal hearing to measure performance in spatial localisation and discrimination tasks involving more than one source. The general approach was to present sources that overlapped in both frequency and time in order to observe performance in the most challenging of situations. Furthermore, the role of two primary sets of location cues in concurrent source listening was probed by examining performance in different spatial dimensions. The binaural cues arise due to the separation of the two ears, and provide information about the lateral position of sound sources. The spectral cues result from location-dependent filtering by the head and pinnae, and allow vertical and front-rear auditory discrimination. Two sets of experiments are described that employed relatively simple broadband noise stimuli. In the first of these, two-point discrimination thresholds were measured using simultaneous noise bursts. 
It was found that the pair could be resolved only if a binaural difference was present; spectral cues did not appear to be sufficient. In the second set of experiments, the two stimuli were made distinguishable on the basis of their temporal envelopes, and the localisation of a designated target source was directly examined. Remarkably robust localisation was observed, despite the simultaneous masker, and both binaural and spectral cues appeared to be of use in this case. Small but persistent errors were observed, which in the lateral dimension represented a systematic shift away from the location of the masker. The errors can be explained by interference in the processing of the different location cues. Overall these experiments demonstrated that the spatial perception of concurrent sound sources is highly dependent on stimulus characteristics and configurations. This suggests that the underlying spatial representations are limited by the accuracy with which acoustic spatial cues can be extracted from a mixed signal. Three sets of experiments are then described that examined spatial performance with speech, a complex natural sound. The first measured how well speech is localised in isolation. This work demonstrated that speech contains high-frequency energy that is essential for accurate three-dimensional localisation. In the second set of experiments, spatial resolution for concurrent monosyllabic words was examined using similar approaches to those used for the concurrent noise experiments. It was found that resolution for concurrent speech stimuli was similar to resolution for concurrent noise stimuli. Importantly, listeners were limited in their ability to concurrently process the location-dependent spectral cues associated with two brief speech sources. In the final set of experiments, the role of spatial hearing was examined in a more relevant setting containing concurrent streams of sentence speech. 
It has long been known that binaural differences can aid segregation and enhance selective attention in such situations. The results presented here confirmed this finding and extended it to show that the spectral cues associated with different locations can also contribute. As a whole, this work provides an in-depth examination of spatial performance in concurrent source situations and delineates some of the limitations of this process. In general, spatial accuracy with concurrent sources is poorer than with single sound sources, as both binaural and spectral cues are subject to interference. Nonetheless, binaural cues are quite robust for representing concurrent source locations, and spectral cues can enhance spatial listening in many situations. The findings also highlight the intricate relationship that exists between spatial hearing, auditory object processing, and the allocation of attention in complex environments.
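The binaural cues discussed in this abstract include the interaural time difference (ITD), the small disparity in a sound's arrival time at the two ears that signals lateral position. As a hypothetical illustration (not drawn from the thesis itself; the function name, sampling rate, and delay value are assumptions for the sketch), an ITD can be estimated from a two-channel recording by cross-correlating the ear signals:

```python
import numpy as np

def estimate_itd(left, right, fs):
    """Estimate the interaural time difference in seconds by finding the
    lag that maximises the cross-correlation of the two ear signals.
    Positive values mean the sound reached the left ear first."""
    corr = np.correlate(left, right, mode="full")
    lags = np.arange(-(len(right) - 1), len(left))
    return -lags[np.argmax(corr)] / fs

# Synthetic example: broadband noise reaching the right ear 20 samples late
fs = 44100
sig = np.random.default_rng(0).standard_normal(2205)  # 50 ms of noise
delay = 20  # samples, ~0.45 ms, within the physiological ITD range
left = sig
right = np.concatenate([np.zeros(delay), sig[:-delay]])
print(estimate_itd(left, right, fs))  # recovers delay / fs
```

In practice the auditory system extracts ITDs in narrow frequency bands rather than broadband, but the cross-correlation idea captures the basic computation.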

The perceived timing of events across different sensory modalities : a psychophysical investigation of multisensory time perception in humans

Hanson, James Vincent Michael, January 2009
The experiments reported within this thesis use psychophysical techniques to examine the factors which determine perceived multisensory timing in humans. Chapters 1 and 2 describe anatomical and psychophysical features of temporal processing, respectively, whilst Chapter 3 introduces the reader to psychophysical methods. Chapter 4 examines the relationship between two measures of sensory latency, reaction time (RT) and crossmodal temporal order judgment (TOJ). Despite task and attentional manipulations the two measures do not correlate, suggesting that they measure some fundamentally different aspect(s) of temporal perception. Chapter 5 examines the effects of adaptation to asynchronous stimulus pairs on perceived audiovisual (AV), audiotactile (AT) and visuotactile (VT) temporal order. Significant temporal shifts are recorded in all three conditions. Evidence is also presented showing that crossmodal TOJs are intransitive. Chapter 6 shows that concurrent adaptation to two sets of asynchronous AV stimulus pairs causes perceived AV temporal order to recalibrate at two locations simultaneously, and that AV asynchrony adaptation effects are significantly affected by observers' attention during adaptation. Finally, Chapter 7 shows that when observers are accustomed to a physical delay between motor actions and sensory events, an event presented at a reduced delay appears to precede the causative motor action. The data are well-described by a simple model based on a strong prior assumption of physical synchrony between motor actions and their sensory consequences.
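Crossmodal temporal order judgments of the kind described here are typically summarised by fitting a psychometric function to the proportion of "audio first" responses across stimulus-onset asynchronies; the 50% point gives the point of subjective simultaneity (PSS), and shifts in that point index the recalibration effects the abstract reports. A minimal sketch with invented data (the SOAs, response proportions, and cumulative-Gaussian choice are illustrative assumptions, not values from the thesis):

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Hypothetical TOJ data: SOA in ms (positive = audio led vision) and the
# proportion of trials on which the observer reported "audio first".
soa = np.array([-120.0, -80.0, -40.0, 0.0, 40.0, 80.0, 120.0])
p_audio_first = np.array([0.05, 0.12, 0.30, 0.55, 0.78, 0.92, 0.97])

def psychometric(x, pss, sigma):
    # Cumulative Gaussian: pss is the 50% point, sigma its spread
    return norm.cdf(x, loc=pss, scale=sigma)

(pss, sigma), _ = curve_fit(psychometric, soa, p_audio_first, p0=[0.0, 50.0])
print(f"PSS = {pss:.1f} ms, spread = {sigma:.1f} ms")
```

Adaptation to an asynchronous stimulus pair would show up as a shift of the fitted PSS between pre- and post-adaptation sessions.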
