
Image super-resolution performance of multilayer feedforward neural networks

Super-resolution is the process by which the bandwidth of a diffraction-limited spectrum is extended beyond the optical passband. Many algorithms are capable of super-resolution; however, most are iterative methods, which are ill-suited for real-time operation. One approach that has been virtually ignored in super-resolution research is the neural network approach. The Hopfield network has been a popular choice in image restoration applications, but it, too, is iterative. We consider the feedforward architecture known as the multilayer perceptron (MLP) and present results on simulated binary and greyscale images blurred by a diffraction-limited optical transfer function (OTF) and sampled at the Nyquist rate. To avoid aliasing, the network acts as a nonlinear spatial interpolator while simultaneously extrapolating in the frequency domain. Additionally, a novel use of vector quantization for generating training data sets is presented: a nonlinear interpolative vector quantizer (NLIVQ) is trained, and its codebooks are subsequently used in the supervised training of the MLP network by back-propagation. The network shows good regularization in the presence of noise.
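The training pipeline described in the abstract (blurred, Nyquist-sampled low-resolution patches in; high-resolution patches out, learned with back-propagation) can be illustrated with a minimal sketch. The sketch below is not the dissertation's method: it substitutes a small separable blur and 2x decimation for the diffraction-limited OTF and sampling step, draws random binary patches instead of NLIVQ codebook vectors, and uses assumed patch sizes, hidden-layer width, and learning rate.

# Minimal sketch: a one-hidden-layer MLP trained by back-propagation to map
# blurred, downsampled image patches to high-resolution patches.
# The blur kernel, decimation factor, patch sizes, and hyperparameters are
# illustrative assumptions, not the dissertation's values.
import numpy as np

rng = np.random.default_rng(0)

LR_PATCH, HR_PATCH = 4, 8                      # assumed low-res and high-res patch sizes
N_IN, N_HID, N_OUT = LR_PATCH**2, 32, HR_PATCH**2

def make_pair(hr_size=HR_PATCH):
    """Synthesize one (low-res, high-res) training pair from a random binary patch."""
    hr = (rng.random((hr_size, hr_size)) > 0.5).astype(float)
    # Crude separable blur + 2x decimation as a stand-in for OTF blurring and sampling.
    k = np.array([1.0, 2.0, 1.0]); k /= k.sum()
    blurred = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 0, hr)
    blurred = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, blurred)
    lr = blurred[::2, ::2]
    return lr.ravel(), hr.ravel()

# Weights for the input->hidden and hidden->output layers.
W1 = rng.normal(0, 0.1, (N_HID, N_IN));  b1 = np.zeros(N_HID)
W2 = rng.normal(0, 0.1, (N_OUT, N_HID)); b2 = np.zeros(N_OUT)

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
eta = 0.1                                      # assumed learning rate

for step in range(20000):
    x, t = make_pair()
    # Forward pass.
    h = sigmoid(W1 @ x + b1)
    y = sigmoid(W2 @ h + b2)
    # Back-propagation of the squared-error gradient.
    dy = (y - t) * y * (1 - y)
    dh = (W2.T @ dy) * h * (1 - h)
    W2 -= eta * np.outer(dy, h); b2 -= eta * dy
    W1 -= eta * np.outer(dh, x); b1 -= eta * dh

# The trained network acts as a nonlinear spatial interpolator:
# a 4x4 blurred patch in, an 8x8 estimate of the underlying scene out.
x, t = make_pair()
print("test MSE:", np.mean((sigmoid(W2 @ sigmoid(W1 @ x + b1) + b2) - t) ** 2))

In this toy setup the MLP interpolates spatially (4x4 in, 8x8 out) and, by reconstructing sharp edges from smooth inputs, recovers energy beyond the passband of the blur, which is the sense in which the abstract's network extrapolates in the frequency domain.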

Identifier: oai:union.ndltd.org:arizona.edu/oai:arizona.openrepository.com:10150/284549
Date: January 1999
Creators: Davila, Carlos Antonio
Contributors: Hunt, Bobby R.
Publisher: The University of Arizona.
Source Sets: University of Arizona
Language: en_US
Detected Language: English
Type: text, Dissertation-Reproduction (electronic)
Rights: Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction or presentation (such as public display or performance) of protected items is prohibited except with permission of the author.
