
The spatial averaging of disparities in brief, static random-dot stereograms

Visual images from the two eyes are transmitted to the brain. Because the eyes are horizontally separated, there is a horizontal disparity between the two images. The amount of disparity between the images of a given point depends on the distance of that point from the viewer's point of fixation. A natural visual environment contains surfaces at many different depths, so the brain must process a spatial distribution of disparities. How are these disparities spatially combined? To answer this question, brief (about 200 msec) static Cyclopean random-dot stereograms were used as stimuli for vergence and depth discrimination. The results indicated a large averaging region for vergence and a smaller pooling region for depth discrimination. Vergence responded to the mean disparity of two transparent planes. When a disparate target was presented in a fixation-plane surround, vergence improved as target size was increased, saturating at 3-6 degrees. Depth discrimination thresholds improved with target size, reaching a minimum at 1-3 degrees, but increased for larger targets. Depth discrimination also depended on the extent of a disparity pedestal surrounding the target, consistent with vergence facilitation. Vergence might therefore implement a coarse-to-fine reduction in binocular matching noise. Interocular decorrelation can be regarded as multiple chance matches at different disparities, and the spatial pooling limits found for disparity were replicated when interocular decorrelation was discriminated. The disparity of the random dots also influenced the apparent horizontal alignment of neighbouring monocular lines, suggesting that disparity averaging takes place at an early stage of visual processing. Three possible explanations were considered: 1) disparities are detected in different spatial frequency channels (Marr and Poggio, 1979); 2) second-order luminance patterns are matched between the two eyes using non-linear channels; 3) secondary disparity filters process disparities extracted from linear filters.
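
As an illustration only (none of this code is from the thesis), the following minimal Python sketch shows the kind of stimulus the abstract describes: a static random-dot stereogram in which a central region is given a horizontal disparity that is invisible monocularly and appears in depth only when the two eyes' images are combined. The function name, image sizes, dot density, and disparity values are assumptions chosen for the sketch; the mean-disparity calculation at the end illustrates the quantity that vergence is reported to track.

```python
# Minimal sketch (not taken from the thesis): a static random-dot stereogram
# with a disparate central target. All names, sizes, densities and disparity
# values here are illustrative assumptions.
import numpy as np

def make_rds(size=256, target=96, disparity_px=4, density=0.5, seed=0):
    """Return (left, right) binary dot images. The central target region is
    shifted horizontally in the right image, so it is invisible monocularly
    but appears at a different depth when the images are fused."""
    rng = np.random.default_rng(seed)
    dots = (rng.random((size, size)) < density).astype(float)
    left = dots.copy()
    right = dots.copy()

    t0 = (size - target) // 2
    t1 = t0 + target
    # np.roll wraps dots around the edge of the target region; real stimuli
    # would refill the uncovered strip with fresh random dots.
    right[t0:t1, t0:t1] = np.roll(dots[t0:t1, t0:t1], disparity_px, axis=1)
    return left, right

left, right = make_rds()

# The abstract reports that vergence tracks the mean disparity over a large
# region (e.g. the average of two transparent planes), while depth
# discrimination pools over a smaller one. A toy mean-disparity calculation:
disparity_map = np.zeros((256, 256))
disparity_map[80:176, 80:176] = 4.0   # target disparity in pixels
print("mean disparity over whole display:", disparity_map.mean())
print("mean disparity over target region:", disparity_map[80:176, 80:176].mean())
```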

Identifier: oai:union.ndltd.org:bl.uk/oai:ethos.bl.uk:287537
Date: January 1999
Creators: Popple, Ariella Vered
Publisher: Durham University
Source Sets: Ethos UK
Detected Language: English
Type: Electronic Thesis or Dissertation
Source: http://etheses.dur.ac.uk/4544/
