
Visual search in natural scenes with and without guidance of fixations

From the airport security guard monitoring luggage to the rushed commuter looking for their car keys, visual search is one of the most common requirements of our visual system. Despite its ubiquity, many aspects of visual search remain unaccounted for by computational models. Difficulty arises in accounting for the internal biases of an observer undertaking a search task and in decomposing an image of a natural scene into its relevant fundamental properties. Previous studies have attempted to understand visual search by using highly simplified stimuli, such as discrete search arrays. Although these studies have been useful, the extent to which search of discrete arrays can represent search of more naturalistic stimuli is subject to debate.

The experiments described in this thesis used images of natural scenes as stimuli and addressed two key objectives. The first was to determine which image properties influenced the detectability of a target. Features investigated included chroma, entropy, contrast, edge contrast and luminance. The proportion of variance in detection ability accounted for by each feature was estimated, and the features were ranked in order of importance to detection. The second objective was to develop a method for guiding human fixations by modifying image features while observers were engaged in a search task. To this end, images were modified using the image-processing method unsharp masking. To assess the effect of the image modification on fixations, eye movements were monitored using an eye-tracker.

Another subject addressed in the thesis was the classification of fixations from eye movement data, for which no standard method exists. Existing methods have employed thresholds for speed, acceleration, duration and stability of point-of-gaze to classify fixations, but these thresholds have no commonly accepted values.
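To illustrate the parameter dependence at issue, the following is a minimal sketch of a conventional velocity-threshold (I-VT-style) classifier, not the thesis's own method. The velocity threshold of 30 position units per second and the function name are illustrative assumptions; in practice the chosen threshold strongly affects which samples are labelled as fixations, which is precisely the problem the thesis's nonparametric method avoids.

```python
import numpy as np

def classify_fixations_ivt(x, y, sample_rate_hz, velocity_threshold=30.0):
    """Label each gaze sample as fixation (True) or saccade (False)
    by thresholding point-to-point gaze speed (I-VT style).

    velocity_threshold is in position units per second and is an
    experimenter-chosen parameter with no commonly accepted value.
    """
    # Point-to-point displacement between consecutive samples.
    dx = np.diff(np.asarray(x, dtype=float))
    dy = np.diff(np.asarray(y, dtype=float))
    # Speed in position units per second.
    speed = np.hypot(dx, dy) * sample_rate_hz
    # A sample is part of a fixation if gaze speed stays below threshold;
    # the first sample has no preceding displacement, so label it True.
    return np.concatenate(([True], speed < velocity_threshold))
```

For example, a gaze trace that holds still, jumps, and holds still again yields a single saccade sample at the jump; halving or doubling the threshold can relabel samples near that boundary, which is why threshold-free classification is attractive.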
Presented in this thesis is an automatic nonparametric method for classifying fixations, which extracts fixations without requiring any input parameters from the experimenter. The method was tested against independent classifications by three experts. The accurate estimation of Kullback-Leibler divergence, an information-theoretic quantity which can be used to compare probability distributions, was also addressed in this thesis, since the quantity was used to compare fixation distributions. Different methods for the estimation of Kullback-Leibler divergence were tested using artificial data, and it was shown that a method for estimating the quantity directly from input data outperformed methods which required binning of data or kernel density estimation to estimate underlying distributions.
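As an illustration of direct estimation without binning or kernel density estimation, the following is a sketch of a published one-dimensional nearest-neighbour estimator of Kullback-Leibler divergence (in the style of Wang, Kulkarni and Verdú, 2009). It is offered as an example of the class of direct estimators the abstract describes, not as the specific estimator used in the thesis; the function name and the O(n²) distance search are assumptions made for brevity.

```python
import numpy as np

def kl_divergence_1nn(x, y):
    """Estimate D(P || Q) directly from 1-D samples x ~ P and y ~ Q
    using nearest-neighbour distances, with no binning or KDE.

    rho_i: distance from x_i to its nearest neighbour among the other
           samples of x; nu_i: distance from x_i to its nearest sample in y.
    Estimate: mean(log(nu_i / rho_i)) + log(m / (n - 1)).
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n, m = len(x), len(y)
    # Brute-force nearest-neighbour distances (fine for a sketch;
    # a k-d tree would be used for large samples).
    rho = np.array([np.abs(np.delete(x, i) - x[i]).min() for i in range(n)])
    nu = np.array([np.abs(y - xi).min() for xi in x])
    return np.log(nu / rho).mean() + np.log(m / (n - 1))
```

With continuous-valued samples the distances are almost surely nonzero, so the logarithm is well defined; duplicate samples would require tie-breaking. When x and y are drawn from the same distribution the estimate is near zero, and it grows as the distributions separate.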

Identifier: oai:union.ndltd.org:bl.uk/oai:ethos.bl.uk:553464
Date: January 2012
Creators: Mould, Matthew Simon
Contributors: Foster, David
Publisher: University of Manchester
Source Sets: Ethos UK
Detected Language: English
Type: Electronic Thesis or Dissertation
Source: https://www.research.manchester.ac.uk/portal/en/theses/visual-search-in-natural-scenes-with-and-without-guidance-of-fixations(00c10ea2-34fa-40b9-a304-7c1829545475).html