Object tracking with a pan, tilt and zoom camera

This thesis reviews existing work on tracking with pan/tilt/zoom cameras and proposes a novel method of multiscale block tracking with perspective cameras, using principles and methods from both computer vision and computer graphics. The theory of perspective transforms in 3D graphics is reviewed, as is the relation of real-world cameras to synthetic cameras through camera calibration. Conventional 2D block tracking methods are reviewed and then extended to track under pan, tilt and zoom conditions. Two methods of pan/tilt/zoom tracking are developed and applied to blocks and to blobs (connected regions of pixels). A novel offline technique of fitting contours to these track points is then explored, both with and without pan, tilt and zoom. A method of updating the multiscale reference block is presented, and the tracker is evaluated both without block updating (feed-forward) and with block updating and position filtering. Experimental results for people following are presented using synthetic ray-traced data with absolute ground truth, and real sequences taken with a commercial pan/tilt/zoom camera with hand-estimated ground truth. All sequences were processed offline; however, with optimisation a real-time implementation appears feasible. Applying position filtering and prediction with a Kalman filter improves accuracy and robustness. The application of this method to existing vision techniques is discussed, as are possible approaches to real-time implementation of these algorithms.
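The Kalman-filter position filtering mentioned above can be illustrated with a minimal sketch. This is not the thesis's implementation: the constant-velocity state model, noise covariances and all numeric values below are assumptions chosen for illustration of the general technique.

```python
# Illustrative sketch only: a constant-velocity Kalman filter for 2D image
# positions, as commonly used to smooth and predict track points.
# The model and noise parameters are assumptions, not taken from the thesis.
import numpy as np

class ConstantVelocityKF:
    def __init__(self, x0, y0, dt=1.0, q=1e-2, r=1.0):
        # State: [x, y, vx, vy]; measurement: [x, y]
        self.x = np.array([x0, y0, 0.0, 0.0])
        self.P = np.eye(4) * 10.0                 # initial state uncertainty
        self.F = np.array([[1, 0, dt, 0],
                           [0, 1, 0, dt],
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], float) # constant-velocity dynamics
        self.H = np.array([[1, 0, 0, 0],
                           [0, 1, 0, 0]], float)  # observe position only
        self.Q = np.eye(4) * q                    # process noise covariance
        self.R = np.eye(2) * r                    # measurement noise covariance

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]                         # predicted position

    def update(self, zx, zy):
        z = np.array([zx, zy])
        y = z - self.H @ self.x                   # innovation
        S = self.H @ self.P @ self.H.T + self.R   # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]                         # filtered position

# A target moving right at ~2 px/frame, with slightly offset measurements:
kf = ConstantVelocityKF(0.0, 0.0)
for t in range(1, 10):
    kf.predict()
    kf.update(2.0 * t + 0.1, 0.0)
pred = kf.predict()                               # one-step-ahead prediction
```

The predicted position can seed the block search window in the next frame, shrinking the search region and bridging frames where the match fails.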

Identifier: oai:union.ndltd.org:bl.uk/oai:ethos.bl.uk:661970
Date: January 2005
Creators: Sinclair, Mark Macdonald
Publisher: University of Edinburgh
Source Sets: Ethos UK
Detected Language: English
Type: Electronic Thesis or Dissertation
Source: http://hdl.handle.net/1842/14426
