
Resource Allocation Using Touch And Audition

When people multitask with inputs that demand attention, processing, and encoding, sensory interference is possible at almost any level. Multiple Resource Theory (MRT) suggests that such interference may be avoided by drawing on the separate pools of resources available when using different sensory channels, memory processes, and even different response modes. Thus, there should be advantages in dividing tasks among different sensory channels to tap independent pools of attentional resources. For example, people perform better on two tasks that use the eye and ear than on two tasks that both use auditory or both use visual inputs. The majority of research on MRT compares vision and audition, i.e., the prime distance senses. The unstated implication is that the theory extends readily to other sensory systems, such as touch, but this remains untested. It also overlooks the fact that each sensory system has different characteristics that can influence how information processing is allocated in a multiple-task environment. For example, vision requires a directed gaze that sound and touch do not. Testing MRT with touch not only rules out competing theories but also helps establish its robustness across the senses. Three experiments compared the senses of touch and hearing to determine whether the characteristics of those sensory modalities alter the allocation of processing resources. Specifically, it was hypothesized that differences in sensory characteristics would affect performance on a simple targeting task. All three experiments used auditory shadowing as the dual-task load. In the first and third experiments, a target was placed to the left or right of the participant, and the targeting cue (tactile, auditory, or combined) used to locate the target originated from the side on which the target was located.
The only difference between experiments 1 and 3 was that in experiment 1 the auditory targeting cue was delivered by headphones, while in experiment 3 it was delivered by speakers. Experiment 2 was more demanding in both auditory perception and processing: the targeting cues came from in front of or behind the participant. A cue from in front meant the target was to the left; conversely, a cue from behind meant the target was to the right. The results of experiments 1 and 3 showed that when the signals originated from the sides, there was no difference in performance between the auditory and tactile targeting cues, whether delivered by proximal or distal stimulation. In experiment 2, however, participants were significantly slower to locate the target with the auditory targeting cue than with the tactile targeting cue, with nearly twice the dual-task cost. No significant differences were found in shadowing performance across the three experiments. The overall findings support the hypothesis that the characteristics of the sensory system itself influence the allocation of processing resources. The differences in experiment 2 are likely due to front-back reversal, a common localization error for auditory stimuli presented in front of or behind the listener that does not occur with tactile stimuli.

Identifier: oai:union.ndltd.org:ucf.edu/oai:stars.library.ucf.edu:etd-1595
Date: 01 January 2005
Creators: Mortimer, David
Publisher: STARS
Source Sets: University of Central Florida
Language: English
Detected Language: English
Type: text
Format: application/pdf
Source: Electronic Theses and Dissertations
