
Fusion of images from dissimilar sensor systems

Approved for public release; distribution is unlimited. / Different sensors exploit different regions of the electromagnetic spectrum; a multi-sensor image fusion system can therefore take full advantage of the complementary capabilities of the individual sensors in the suite to produce information that cannot be obtained by viewing the images separately. In this thesis, a framework for the multiresolution fusion of night vision device and thermal infrared imagery is presented. It encompasses a wavelet-based approach that supports both pixel-level and region-based fusion, and aims to maximize scene content by incorporating spectral information from both source images. In pixel-level fusion, the source images are decomposed into different scales, and salient directional features are extracted and selectively fused by comparing the corresponding wavelet coefficients. To increase the degree of subject relevance in the fusion process, a region-based approach is proposed that uses a multiresolution segmentation algorithm to partition the image domain at different scales. The characteristics of each region are then determined and used to guide the fusion process. The experimental results obtained demonstrate the feasibility of the approach. Potential applications of this development include improvements in night piloting (navigation and target discrimination), law enforcement, etc. / Civilian, Republic of Singapore
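
The coefficient-selection idea described in the abstract can be illustrated with a brief sketch. The fragment below is not the thesis implementation; it is a generic, minimal example of pixel-level wavelet fusion using the PyWavelets library, where the wavelet name ("db2"), the number of decomposition levels, and the averaging of the coarse approximation band are illustrative assumptions.

    import numpy as np
    import pywt  # PyWavelets

    def fuse_pixel_level(img_a, img_b, wavelet="db2", levels=3):
        """Sketch of pixel-level wavelet fusion: decompose both source
        images, keep the larger-magnitude detail coefficient at each
        position and scale, average the coarse approximations, and
        reconstruct the fused image."""
        ca = pywt.wavedec2(img_a, wavelet, level=levels)
        cb = pywt.wavedec2(img_b, wavelet, level=levels)

        fused = [(ca[0] + cb[0]) / 2.0]        # approximation band: average
        for da, db in zip(ca[1:], cb[1:]):     # detail bands, coarse to fine
            fused.append(tuple(
                np.where(np.abs(a) >= np.abs(b), a, b)  # max-magnitude rule
                for a, b in zip(da, db)
            ))
        return pywt.waverec2(fused, wavelet)

A region-based variant would replace this per-coefficient comparison with a decision made once per segmented region, as the abstract describes.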

Identifier: oai:union.ndltd.org:nps.edu/oai:calhoun.nps.edu:10945/1279
Date: 12 1900
Creators: Chow, Khin Choong
Contributors: Fargues, Monique P., Cooper, Alfred W., Naval Postgraduate School (U.S.)., Physics
Publisher: Monterey, California. Naval Postgraduate School
Source Sets: Naval Postgraduate School
Detected Language: English
Type: Thesis
Format: xiv, 77 p. : ill. (some col.), application/pdf
Rights: Copyright is reserved by the copyright owner.