Context Effects in Early Visual Processing and Eye Movement Control
Nortmann, Nora, 29 April 2015
There is a difference between the raw sensory input to the brain and our stable perception of entities in the environment. A first approach to investigating perception is to study relationships between properties of currently presented stimuli and biological correlates of perceptual processes. However, such processes do not depend on the current stimulus alone. The sampling of information and the concurrent neuronal processing of stimulus content rely on contextual relationships in the environment, and between the environment and the body. Perceptual processes dynamically adjust to relevant context, such as the organism's current task and its immediate history. To understand perception, we therefore have to study how the processing of current stimulus content is influenced by such contextual factors. This thesis investigates the influence of these factors on visual processing, in particular the effects of temporal context in early visual processing and of task context in eye movement control.

To investigate the effects of contextual factors on early visual processing of current stimulus content, we study neuronal processing of visual information in the primary visual cortex. We use real-time optical imaging with voltage-sensitive dyes to capture neuronal population activity with millisecond resolution across several millimeters of cortical area. To characterize the cortical layout with respect to the mapping of orientation, prior to the further investigations, we use smoothly moving grating stimuli. Investigating responses to this stimulus type systematically, we find independent encoding of local contrast and orientation, and a direct mapping of current stimulus content onto cortical activity (Study 1).

To investigate the influence of the previous stimulus, as context, on the processing of current stimulus content, we use abrupt visual changes in sequences of modified natural images. Earlier studies, which investigated relatively fast timescales, found that the primary visual cortex continuously represents the current input (ongoing encoding), with little interference from past stimuli. We investigate whether this coding scheme generalizes to cases in which stimuli change more slowly, as frequently encountered in natural visual input. We use sequences of natural scene contours, composed of vertically and horizontally filtered natural images, their superpositions, and a blank stimulus, presented at 10 or 33 Hz. We show that at the low temporal frequency, cortical activity patterns do not encode the orientations currently present but instead reflect their relative changes over time. For example, when a stimulus with horizontal orientation is followed by the superposition of both orientations, the pattern of cortical activity represents the newly added vertical orientations rather than the full sum of orientations. Correspondingly, contour removal from the superposition leads to the representation of the orientations that have disappeared rather than those that remain. This is in sharp contrast to more rapid sequences, for which we find an ongoing representation of the present input, consistent with earlier studies. In summary, for slow stimulus sequences, populations of neurons in the primary visual cortex are no longer tuned to the orientations within individual stimuli but instead represent the difference between consecutive stimuli.
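As an illustration only, and not the analysis used in the thesis, the following Python sketch contrasts the two coding schemes described above: under ongoing encoding a toy population response tracks the orientation content of the current frame, whereas under difference encoding it tracks the (unsigned) change relative to the previous frame. The frame vectors, the rate threshold separating the two regimes, and all names are hypothetical.

```python
import numpy as np

def population_response(frames, frame_rate_hz, fast_rate_threshold_hz=20.0):
    """Toy model: each frame is a vector of orientation energies,
    e.g. [horizontal, vertical]. At fast frame rates the response
    follows the current frame (ongoing encoding); at slow rates it
    follows the unsigned change between consecutive frames
    (difference encoding). Purely illustrative."""
    responses = []
    previous = np.zeros_like(frames[0])
    for frame in frames:
        if frame_rate_hz >= fast_rate_threshold_hz:
            responses.append(frame.copy())               # ongoing encoding
        else:
            responses.append(np.abs(frame - previous))   # difference encoding
        previous = frame
    return np.array(responses)

# Sequence: blank -> horizontal -> horizontal+vertical -> horizontal
blank = np.array([0.0, 0.0])
horiz = np.array([1.0, 0.0])
both  = np.array([1.0, 1.0])
frames = [blank, horiz, both, horiz]

print(population_response(frames, frame_rate_hz=33))  # tracks current orientation content
print(population_response(frames, frame_rate_hz=10))  # tracks added or removed contours
```

In the slow case the sketch reproduces the pattern described in the abstract: adding the vertical contours yields a response dominated by the newly added orientation, and removing them yields a response dominated by the orientation that disappeared. The hard threshold merely separates the two presentation rates mentioned above (10 and 33 Hz); it makes no claim about how the transition between regimes behaves at intermediate rates.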
Our results emphasize the influence of temporal context on early visual processing and, consequently, on the transmission of information to higher cortical areas (Study 2).

To study the effects of contextual factors on the sampling of visual information, we focus on human eye movement control. The eyes are actively moved to sample visual information from the environment. Some traditional approaches predict eye movements solely from simple stimulus properties, such as local contrasts (stimulus-driven factors). Recent arguments, however, emphasize the influence of tasks (task context) and bodily factors (spatial biases). To investigate how contextual factors affect eye movement control, we quantify the relative influences of task context, spatial biases, and stimulus-driven factors. Participants view and classify natural scenery and faces while their eye movements are recorded. The stimuli are composed of small image patches. For each patch we derive a measure that quantifies stimulus-driven factors, based on the image content of the patch, and a measure of spatial viewing bias, based on the location of the patch. Using the participants' classification responses, we additionally derive a measure that reflects the information content of a patch in the context of the given task. We show that the effect of spatial biases is strongest, that task context is a close runner-up, and that stimulus-driven factors have, on average, a smaller influence. Remarkably, all three factors make independent and significant contributions to the selection of viewed locations. Hence, in addition to stimulus-driven factors and spatial biases, the task context contributes to visual sampling behavior and has to be considered in any model of human eye movements.
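As an illustrative sketch only, and not the analysis pipeline of the thesis, one simple way to ask whether three patch-wise measures make independent contributions is to enter them jointly as predictors of whether a patch was fixated, for example in a logistic regression. The predictor names and the synthetic data below are hypothetical; the simulated effect sizes merely mirror the ordering reported above (spatial bias > task context > stimulus-driven factors).

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_patches = 5000

# Hypothetical standardized patch-wise predictors:
stimulus_driven = rng.standard_normal(n_patches)  # e.g., local contrast
spatial_bias    = rng.standard_normal(n_patches)  # e.g., closeness to a preferred region
task_relevance  = rng.standard_normal(n_patches)  # e.g., informativeness for the task

# Synthetic ground truth with the assumed ordering of influences.
logit = 1.0 * spatial_bias + 0.8 * task_relevance + 0.4 * stimulus_driven - 1.5
fixated = rng.random(n_patches) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([stimulus_driven, spatial_bias, task_relevance])
model = LogisticRegression().fit(X, fixated)

# With standardized predictors, coefficient magnitudes indicate relative
# influence; all three being reliably nonzero would indicate independent
# contributions to the selection of viewed patches.
print(dict(zip(["stimulus_driven", "spatial_bias", "task_relevance"],
               model.coef_[0].round(2))))
```

Coefficient magnitudes from such a fit speak to relative influence only under the linear assumptions of the model; it is meant as a conceptual illustration of combining the three factors, not as a reconstruction of the study's statistics.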
Visual processing of current stimulus content, in particular visual sampling behavior and early processing, is inherently dependent on context. We show that, already at the first cortical stage, temporal context strongly affects the processing of new visual information, and that visual sampling by eye movements is significantly influenced by task context, independently of spatial and stimulus-driven factors. The empirical results presented here provide a foundation for an improved theoretical understanding of the role of context in perceptual processes.