Mathematical models of classes of objects can contribute significantly to the analysis of digital images. A major problem in modelling is to establish descriptions that capture not only a single object but also the variation that is usually present within a class of objects. The objective of this thesis is to develop more general modelling strategies than those in common use today. In particular, the influence of the human factor in the model creation process should be minimised, since the human capacity for abstraction is presumed to impose undesired constraints on the description. Common approaches are discussed and compared from the viewpoint of generality. The technique considered here introduces appearance space as a common framework for representing both shapes and images. In appearance space, an object is represented by a single point in a high-dimensional vector space; accordingly, a class of objects subject to variation appears as a nonlinear manifold, often characterised by only a few intrinsic dimensions. A model of a class of objects is therefore identified with the mathematical description of this manifold. The nonlinearity of these manifolds motivates the use of auto-associative artificial neural networks in the modelling process: the network extracts nonlinear modes of variation from a set of training examples. The procedure is evaluated on both synthetic and natural shape and image data and shows promising results as a general approach to object modelling.
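To make the idea concrete, the following is a minimal, self-contained sketch, not the implementation used in the thesis: the synthetic ellipse data, the network sizes, and the plain-gradient-descent training loop are all illustrative assumptions. It embeds each shape as a point in appearance space and trains a small auto-associative network with a one-unit bottleneck to recover the single intrinsic degree of freedom of the class.

```python
# Sketch (illustrative assumptions throughout, not the thesis's setup):
# an auto-associative network with a one-unit bottleneck recovers the
# single intrinsic parameter of a synthetic shape class embedded in
# appearance space.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: ellipses traced by 32 landmarks; one parameter t
# controls the aspect ratio.  Each shape is flattened into a
# 64-dimensional appearance-space vector, so the class forms a
# 1-D nonlinear manifold in R^64.
def make_shape(t, n_landmarks=32):
    angles = np.linspace(0.0, 2.0 * np.pi, n_landmarks, endpoint=False)
    x = np.cos(angles)
    y = (0.3 + 0.7 * t) * np.sin(angles)  # aspect ratio varies with t
    return np.concatenate([x, y])

T = rng.uniform(0.0, 1.0, size=200)
X = np.stack([make_shape(t) for t in T])  # (200, 64) training set

# Auto-associative network 64 -> 16 -> 1 -> 16 -> 64: tanh hidden
# layers, linear bottleneck and output, trained to reproduce its input.
D, H, K = X.shape[1], 16, 1
scale = 0.1
W1, b1 = scale * rng.standard_normal((D, H)), np.zeros(H)
W2, b2 = scale * rng.standard_normal((H, K)), np.zeros(K)
W3, b3 = scale * rng.standard_normal((K, H)), np.zeros(H)
W4, b4 = scale * rng.standard_normal((H, D)), np.zeros(D)

lr = 0.05
for epoch in range(5000):
    # forward pass
    h1 = np.tanh(X @ W1 + b1)
    z = h1 @ W2 + b2              # bottleneck: the nonlinear mode
    h2 = np.tanh(z @ W3 + b3)
    Y = h2 @ W4 + b4              # reconstruction
    err = Y - X

    # backward pass for the mean-squared reconstruction error
    g4 = err / len(X)
    gW4, gb4 = h2.T @ g4, g4.sum(0)
    g3 = (g4 @ W4.T) * (1.0 - h2**2)
    gW3, gb3 = z.T @ g3, g3.sum(0)
    g2 = g3 @ W3.T
    gW2, gb2 = h1.T @ g2, g2.sum(0)
    g1 = (g2 @ W2.T) * (1.0 - h1**2)
    gW1, gb1 = X.T @ g1, g1.sum(0)

    for p, g in [(W1, gW1), (b1, gb1), (W2, gW2), (b2, gb2),
                 (W3, gW3), (b3, gb3), (W4, gW4), (b4, gb4)]:
        p -= lr * g

# The bottleneck activation should correlate (up to sign) with the
# hidden generative parameter t.
z_all = np.tanh(X @ W1 + b1) @ W2 + b2
corr = abs(np.corrcoef(z_all[:, 0], T)[0, 1])
print(f"reconstruction MSE: {np.mean(err**2):.5f}, |corr(z, t)|: {corr:.3f}")
```

The bottleneck activation plays the role of the manifold's intrinsic coordinate; a class with more intrinsic dimensions would simply require a wider bottleneck.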
Identifier | oai:union.ndltd.org:UPSALLA1/oai:DiVA.org:uu-4234
Date | January 2004
Creators | Wehrmann, Felix
Publisher | Uppsala universitet, Centrum för bildanalys, Uppsala : Acta Universitatis Upsaliensis |
Source Sets | DiVA Archive at Uppsala University
Language | English |
Detected Language | English |
Type | Doctoral thesis, monograph, info:eu-repo/semantics/doctoralThesis, text |
Format | application/pdf |
Rights | info:eu-repo/semantics/openAccess |
Relation | Uppsala Dissertations from the Faculty of Science and Technology, 1104-2516 ; 54 |