
A Novel Refinement Method For Automatic Image Annotation Systems

Image annotation can be defined as the process of assigning a set of content-related words to an image. An automatic image annotation system constructs relationships between words and the low-level visual descriptors extracted from images, and uses these relationships to annotate previously unseen images. The growing demand for annotated images increases the need for automatic image annotation systems. However, the performance of current annotation methods is far from adequate for practical use. The most common problem of current methods is the gap between semantic words and low-level visual descriptors. Because of this semantic gap, the annotation results of these methods contain irrelevant, noisy words. To obtain more relevant results, refinement methods should be applied to the outputs of classical image annotation.

In this work, we present a novel refinement approach for the image annotation problem. The proposed system attacks the semantic gap problem by using relationships between words that are obtained from the dataset. Establishing these relationships is the most crucial step of the refinement process. In this study, we suggest a probabilistic and fuzzy approach for modelling the relationships among the words in the vocabulary, which is then employed to generate candidate annotations based on the output of the image annotator. Candidate annotations are represented by a set of relational graphs. Finally, one of the generated candidate annotations is selected as the refined annotation result by applying a clique optimization technique to the candidate annotation graph.
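The thesis itself is not reproduced here, so the sketch below is only a minimal illustration of the general refinement idea described in the abstract: word relatedness is estimated from co-occurrence in the training annotations, and the refined annotation is chosen as the candidate word subset (clique) whose members are most strongly related to each other. The function names `cooccurrence_model` and `refine`, the relatedness formula, and the brute-force clique search are illustrative assumptions, not the author's actual method.

```python
from collections import Counter
from itertools import combinations

def cooccurrence_model(training_annotations):
    """Estimate a pairwise word-relatedness score from co-occurrence
    in the training annotations (an assumed, simplified model)."""
    word_counts = Counter()
    pair_counts = Counter()
    for words in training_annotations:
        unique = set(words)
        word_counts.update(unique)
        for wi, wj in combinations(sorted(unique), 2):
            pair_counts[(wi, wj)] += 1

    def relatedness(wi, wj):
        key = tuple(sorted((wi, wj)))
        joint = pair_counts.get(key, 0)
        return joint / max(word_counts.get(wi, 0), word_counts.get(wj, 0), 1)

    return relatedness

def refine(candidate_words, relatedness, k=3):
    """Select the k-word subset (clique) of the candidate annotation
    with the highest total pairwise relatedness."""
    best, best_score = None, -1.0
    size = min(k, len(candidate_words))
    for subset in combinations(candidate_words, size):
        score = sum(relatedness(a, b) for a, b in combinations(subset, 2))
        if score > best_score:
            best, best_score = subset, score
    return list(best) if best else list(candidate_words)

# Toy usage: noisy annotator output is pruned to the most coherent word set.
training = [["sky", "cloud", "plane"], ["sky", "sea", "beach"], ["sea", "beach", "sand"]]
rel = cooccurrence_model(training)
print(refine(["sky", "beach", "sea", "plane"], rel, k=3))  # -> ['sky', 'beach', 'sea']
```

The brute-force subset search stands in for the clique optimization step; on a real vocabulary a dedicated maximum-weight clique algorithm over the candidate annotation graph would be used instead.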

Identifier: oai:union.ndltd.org:METU/oai:etd.lib.metu.edu.tr:http://etd.lib.metu.edu.tr/upload/12613346/index.pdf
Date: 01 June 2011
Creators: Demircioglu, Ersan
Contributors: Yarman Vural, Fatos Tunay
Publisher: METU
Source Sets: Middle East Technical Univ.
Language: English
Detected Language: English
Type: M.S. Thesis
Format: text/pdf
Rights: To liberate the content for public access