
Matching affine-distorted images

Many visual tasks involve matching image patches derived from imaging a scene from different viewpoints. Matching two image patches can, however, be a difficult task, because changes in the relative orientation of a surface with respect to a camera cause deformations in the image of the surface. This deformation therefore needs to be taken into account when matching or registering two image patches of an object under changes in viewpoint. Up to first order, these deformations can be described by an affine transform. Here, a computational scheme to match two image patches under an affine transform is presented. The two image patches are filtered with Gaussian and derivative of Gaussian filters. The problem of matching the two image patches is then recast as one of finding the amount by which these filters must be deformed so that the filtered outputs from the two images are equal. For robustness, it is necessary to use the filter outputs from many points in a small region to obtain an overconstrained system of equations. The resulting equations are linearized with respect to the affine transform and then iteratively solved for its parameters. The method is local and can match image patches in situations where other algorithms fail. It is also shown that the same framework may be used to match points and lines.
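The scheme outlined above can be sketched in code. The following is an illustrative least-squares implementation of the general idea, not the dissertation's exact method: both patches are smoothed with a Gaussian, the brightness difference is linearized in the six affine parameters using derivative-of-Gaussian responses, and the resulting overconstrained system (one equation per pixel in the region) is solved and iterated. The function name `match_affine`, the parameter choices, and the additive update rule are all assumptions made for this sketch.

```python
import numpy as np
from scipy.ndimage import affine_transform, gaussian_filter

def match_affine(target, source, sigma=2.0, n_iter=30, margin=8):
    """Estimate (M, t) so that source(M @ x + t) approximates target.

    Illustrative sketch: smooth both patches with a Gaussian, linearize
    the brightness difference in the six affine parameters via
    derivative-of-Gaussian filter outputs, and iteratively solve the
    resulting overconstrained least-squares system.  Coordinates follow
    scipy's (row, col) convention; `margin` excludes border pixels
    affected by the warp's fill value.
    """
    h, w = target.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    interior = np.s_[margin:h - margin, margin:w - margin]
    tgt = gaussian_filter(target, sigma)
    M = np.eye(2)           # current affine matrix estimate
    t = np.zeros(2)         # current translation estimate
    for _ in range(n_iter):
        # Warp the source patch by the current affine estimate.
        warped = affine_transform(source, M, offset=t, order=3)
        wsm = gaussian_filter(warped, sigma)
        # Derivative-of-Gaussian responses of the warped patch.
        gy = gaussian_filter(warped, sigma, order=(1, 0))
        gx = gaussian_filter(warped, sigma, order=(0, 1))
        # One linearized equation per interior pixel in the six
        # parameter increments (dM11, dM12, dty, dM21, dM22, dtx).
        J = np.stack([gy * ys, gy * xs, gy,
                      gx * ys, gx * xs, gx], axis=-1)[interior]
        err = (tgt - wsm)[interior]
        dp, *_ = np.linalg.lstsq(J.reshape(-1, 6), err.ravel(), rcond=None)
        # Additive update of the affine parameters.
        M += np.array([[dp[0], dp[1]], [dp[3], dp[4]]])
        t += np.array([dp[2], dp[5]])
    return M, t
```

On a synthetic patch warped by a known small affine transform, iterating this update recovers the matrix and translation to sub-pixel accuracy; for larger deformations a coarse-to-fine (multi-scale) strategy is typically needed.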

Identifier: oai:union.ndltd.org:UMASS/oai:scholarworks.umass.edu:dissertations-2943
Date: 01 January 1997
Creators: Manmatha, Raghavan
Publisher: ScholarWorks@UMass Amherst
Source Sets: University of Massachusetts, Amherst
Language: English
Detected Language: English
Type: text
Source: Doctoral Dissertations Available from Proquest
