  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
461

Efficient Bone Conduction Hearing Device with a Novel Piezoelectric Transducer Using Skin as an Electrode / 皮膚を電極とする新たな圧電素子を用いた骨導補聴器の開発

Furuta, Ichiro 24 November 2022 (has links)
Kyoto University / New degree system, doctoral program / Doctor of Medical Science / Kou Degree No. 24286 / Medical Doctorate No. 4902 / 新制||医||1061 (University Library) / Kyoto University Graduate School of Medicine, Department of Medicine / (Chief examiner) Professor 森本 尚樹, Professor 辻川 明孝, Professor 渡邉 大 / Qualified under Article 4, Paragraph 1 of the Degree Regulations / Doctor of Medical Science / Kyoto University / DFAM
462

Shaping Sound | Tuning Architecture in the Soniferous Garden

Konsen, Andrei K. January 2010 (has links)
No description available.
463

Development of a Prototype Auditory Training Computer Program for Hearing Impaired Preschoolers

Doster, Leslie R. 01 January 1987 (has links) (PDF)
A computer program that pairs auditory stimuli with visual stimuli was developed for the purpose of providing auditory training for the hearing impaired. It uses a Texas Instruments 99/4A computer and the Extended BASIC programming language, which offers considerable graphics and sound capability. The lessons make full use of the sixteen available colors, and sound is provided in three ways: by the Texas Instruments speech synthesizer, by the computer itself (musical tones and noise), and by a tape recorder controlled by the computer. The lessons, designed for children ages three to five, focus on awareness of sound, environmental sounds, discrimination of changes in pitch and duration of sound, recognition of rhythm, and early language learning. At this beginning level, the program teaches primarily by pairing the stimuli repeatedly, but some higher-level tasks require input from the child to identify a stimulus.
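A minimal sketch of the lesson structure described above, with Python standing in for the original Extended BASIC. The stimulus names, the placeholder show/play routines, and the lesson contents are hypothetical illustrations, not material from the thesis.

```python
import time

# Each lesson item pairs an auditory stimulus with a visual one.
LESSON_ITEMS = [
    {"sound": "doorbell", "picture": "front door"},
    {"sound": "dog bark", "picture": "dog"},
    {"sound": "rising tone", "picture": "bird flying up"},
    {"sound": "falling tone", "picture": "ball rolling down"},
]

def show_picture(name):
    print(f"[screen] {name}")        # stand-in for the graphics call

def play_sound(name):
    print(f"[audio ] {name}")        # stand-in for synthesizer/tone/tape output

def teach_by_pairing(item, repetitions=3):
    """Beginning level: repeatedly present the sound and picture together."""
    for _ in range(repetitions):
        show_picture(item["picture"])
        play_sound(item["sound"])
        time.sleep(0.2)              # brief pause between pairings

def identification_trial(item, choices):
    """Higher level: play the sound alone and ask the child to pick the picture."""
    play_sound(item["sound"])
    print(f"[screen] choose the matching picture: {choices}")
    return item["picture"]           # the response that counts as correct

if __name__ == "__main__":
    for item in LESSON_ITEMS:
        teach_by_pairing(item)
    identification_trial(LESSON_ITEMS[0], [i["picture"] for i in LESSON_ITEMS])
```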
464

Identification of auditory sequences by hearing-impaired and normal-hearing children

Ling, Agnes H. January 1972 (has links)
No description available.
465

The neural correlates of memory for nonlinguistic emotional vocalizations using structural neuroimaging techniques

Chochol, Caroline. January 2008 (has links)
No description available.
466

A computer-assisted method for training and researching timbre memory and evaluation skills

Quesnel, René. January 2001 (has links)
No description available.
467

Recalibration of perceived time across sensory modalities

Hanson, James Vincent Michael, Heron, James, Whitaker, David J. January 2008 (has links)
When formulating an estimate of event time, the human sensory system has been shown to possess a degree of perceptual flexibility. Specifically, the perceived relative timing of auditory and visual stimuli is, to some extent, a product of recent experience. It has been suggested that this form of sensory recalibration may be peculiar to the audiovisual domain. Here we investigate how adaptation to sensory asynchrony influences the perceived temporal order of audiovisual, audiotactile and visuotactile stimulus pairs. Our data show that a brief period of repeated exposure to asynchrony in any of these sensory pairings results in marked changes in subsequent temporal order judgments: the point of perceived simultaneity shifts toward the level of adaptation asynchrony. We find that the size and nature of this shift is very similar in all three pairings and that sensitivity to asynchrony is unaffected by the adaptation process. In light of these findings we suggest that a single supramodal mechanism may be responsible for the observed recalibration of multisensory perceived time.
468

Audiovisual time perception is spatially specific

Heron, James, Roach, N.W., Hanson, James Vincent Michael, McGraw, Paul V., Whitaker, David J. January 2012 (has links)
Our sensory systems face a daily barrage of auditory and visual signals whose arrival times form a wide range of audiovisual asynchronies. These temporal relationships constitute an important metric for the nervous system when surmising which signals originate from common external events. Internal consistency is known to be aided by sensory adaptation: repeated exposure to consistent asynchrony brings perceived arrival times closer to simultaneity. However, given the diverse nature of our audiovisual environment, functionally useful adaptation would need to be constrained to signals that were generated together. In the current study, we investigate the role of two potential constraining factors: spatial and contextual correspondence. By employing an experimental design that allows independent control of both factors, we show that observers are able to simultaneously adapt to two opposing temporal relationships, provided they are segregated in space. No such recalibration was observed when spatial segregation was replaced by contextual stimulus features (in this case, pitch and spatial frequency). These effects provide support for dedicated asynchrony mechanisms that interact with spatially selective mechanisms early in visual and auditory sensory pathways.
469

Duration channels mediate human time perception

Heron, James, Aaen-Stockdale, Craig, Hotchkiss, John, Roach, N.W., McGraw, Paul V., Whitaker, David J. January 2012 (has links)
The task of deciding how long sensory events seem to last is one that the human nervous system appears to perform rapidly and, for sub-second intervals, seemingly without conscious effort. That these estimates can be performed within and between multiple sensory and motor domains suggests that time perception forms one of the core, fundamental processes of our perception of the world around us. Given this significance, the current paucity in our understanding of how this process operates is surprising. One candidate mechanism for duration perception posits that duration may be mediated via a system of duration-selective 'channels', which are differentially activated depending on the match between afferent duration information and the channels' 'preferred' duration. However, this model awaits experimental validation. In the current study, we use the technique of sensory adaptation, and we present data that are well described by banks of duration channels that are limited in their bandwidth, sensory-specific, and appear to operate at a relatively early stage of visual and auditory sensory processing. Our results suggest that many of the computational principles the nervous system applies to coding visual spatial and auditory spectral information are common to its processing of temporal extent.
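A small numerical sketch of the kind of duration-channel model described above: a bank of duration-tuned channels with limited bandwidth whose pooled response is read out as perceived duration, with adaptation modelled as a gain reduction in channels tuned near the adapted duration. The channel preferences, bandwidth and adaptation rule below are illustrative assumptions, not parameters from the study.

```python
import numpy as np

PREFERRED = np.array([0.1, 0.2, 0.4, 0.8, 1.6])  # channel preferred durations (s), assumed
SIGMA = 0.5                                       # tuning bandwidth in log-duration units, assumed

def channel_responses(duration, gains=None):
    """Log-Gaussian tuning: response falls off with log-duration distance."""
    gains = np.ones_like(PREFERRED) if gains is None else gains
    d = np.log(duration) - np.log(PREFERRED)
    return gains * np.exp(-d**2 / (2 * SIGMA**2))

def perceived_duration(duration, gains=None):
    """Read out the population response as a gain-weighted centroid (log scale)."""
    r = channel_responses(duration, gains)
    return np.exp(np.sum(r * np.log(PREFERRED)) / np.sum(r))

# Adaptation modelled purely as reduced gain in channels tuned near the adapted duration.
adapt_dur = 0.2
adapted_gains = 1 - 0.5 * np.exp(-(np.log(PREFERRED) - np.log(adapt_dur))**2 / (2 * SIGMA**2))

test = 0.3
print("baseline read-out:", perceived_duration(test))                # roughly veridical
print("adapted read-out :", perceived_duration(test, adapted_gains))  # pushed away from 0.2 s
```

With these toy parameters, the adapted read-out for a 0.3 s test shifts away from the 0.2 s adapter, the kind of repulsive aftereffect such channel accounts predict.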
470

Asynchrony adaptation reveals neural population code for audio-visual timing

Roach, N.W., Heron, James, Whitaker, David J., McGraw, Paul V. January 2011 (has links)
The relative timing of auditory and visual stimuli is a critical cue for determining whether sensory signals relate to a common source and for making inferences about causality. However, the way in which the brain represents temporal relationships remains poorly understood. Recent studies indicate that our perception of multisensory timing is flexible: adaptation to a regular inter-modal delay alters the point at which subsequent stimuli are judged to be simultaneous. Here, we measure the effect of audio-visual asynchrony adaptation on the perception of a wide range of sub-second temporal relationships. We find distinctive patterns of induced biases that are inconsistent with previous explanations based on changes in perceptual latency. Instead, our results can be well accounted for by a neural population coding model in which: (i) relative audio-visual timing is represented by the distributed activity across a relatively small number of neurons tuned to different delays; (ii) the algorithm for reading out this population code is efficient, but subject to biases owing to under-sampling; and (iii) the effect of adaptation is to modify neuronal response gain. These results suggest that multisensory timing information is represented by a dedicated population code and that shifts in perceived simultaneity following asynchrony adaptation arise from analogous neural processes to well-known perceptual after-effects.
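A compact sketch of the broad class of model described in this abstract: a handful of neurons tuned to different audio-visual delays, a simple read-out of their pooled activity, and adaptation implemented purely as a change in response gain. The tuning preferences, widths, gain rule and centroid decoder are assumptions chosen for illustration, not the model fitted in the paper.

```python
import numpy as np

PREFERRED_DELAYS = np.array([-300, -150, 0, 150, 300])  # ms, audio-leading negative; assumed
TUNING_WIDTH = 150.0                                     # ms, assumed

def responses(delay_ms, gains=None):
    """Gaussian tuning of each delay-selective neuron."""
    gains = np.ones_like(PREFERRED_DELAYS, float) if gains is None else gains
    return gains * np.exp(-(delay_ms - PREFERRED_DELAYS)**2 / (2 * TUNING_WIDTH**2))

def decoded_delay(delay_ms, gains=None):
    """Simple centroid read-out of the population activity."""
    r = responses(delay_ms, gains)
    return np.sum(r * PREFERRED_DELAYS) / np.sum(r)

def adapt(adapt_delay_ms, strength=0.5):
    """Adaptation modelled purely as a gain reduction in neurons tuned near the adapter."""
    return 1 - strength * np.exp(-(adapt_delay_ms - PREFERRED_DELAYS)**2 / (2 * TUNING_WIDTH**2))

# Before adaptation a physically simultaneous pair decodes to ~0 ms;
# after adapting to a vision-leading asynchrony (+150 ms) the decoded value shifts.
gains_after = adapt(+150)
print("pre-adaptation :", decoded_delay(0))
print("post-adaptation:", decoded_delay(0, gains_after))
```

With these toy numbers, a physically simultaneous pair no longer decodes to 0 ms after adaptation, so the physical asynchrony judged as simultaneous shifts, mirroring the adaptation-induced changes in perceived simultaneity described above.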
