271
THE KNIFE EDGE TEST AS A WAVEFRONT SENSOR (IMAGE PROCESSING). KENKNIGHT, CHARLES ELMAN. January 1987
An algorithm for reducing data from the knife edge test is given. The method is an extension of the theory of single sideband holography to second-order effects. Application to phase microscopy is especially useful because a troublesome second-order term vanishes when the knife edge does not attenuate the unscattered radiation probing the specimen. The algorithm was tested by simulating an active optics system that sensed and corrected small (less than a quarter wavelength) wavefront errors. Convergence to a null was quadratic until limited by detector-injected noise in the signal. The best form of the algorithm used only a Fourier transform of the smoothed detector record, a filtering of the transform, an inverse transform, and an arctangent solving for the phase of the input wavefront deformation. Iterations were helpful only for a Wiener filtering of the data record that weighted down Fourier amplitudes smaller than the mean noise level before analysis. The simplicity and sensitivity of this wavefront sensor make it a candidate for active optics control of small-angle light scattering in space. In real-time optical processing, a two-dimensional signal can be applied as a voltage to a deformable mirror and received as an intensity modulation at an output plane. Combining these features may permit a real-time null test. Application to electron microscopy should allow defocus, astigmatism, and spherical aberration to be found from single micrographs at 0.2 nm resolution, provided a combination of specimen and support membrane is used that permits some a priori knowledge. For some thin specimens (up to nearly 100 atom layers thick), the left-right symmetry of diffraction should allow reconstruction of the wavefront deformations caused by the specimen with double the bandpass used in each image.
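[Editor's note: a minimal one-dimensional sketch of the pipeline the abstract names (Fourier transform, sideband filter with Wiener-style weighting, inverse transform, arctangent) is given below. The sideband placement and the noise model are illustrative assumptions, not the dissertation's exact formulation.]

```python
import numpy as np

def knife_edge_phase(detector_record, noise_level):
    """Single-sideband-style phase recovery from a knife-edge record.

    Sketch of: FFT -> keep one sideband -> Wiener-style weighting of
    amplitudes below the noise level -> inverse FFT -> arctangent.
    """
    spectrum = np.fft.fft(detector_record)
    n = spectrum.size

    # Keep only the positive-frequency sideband (single-sideband
    # holography analogue); this placement is an assumption here.
    ssb = np.zeros_like(spectrum)
    ssb[1:n // 2] = spectrum[1:n // 2]

    # Wiener-style filter: weight down Fourier amplitudes smaller
    # than the mean noise level, as in the abstract's best variant.
    power = np.abs(ssb) ** 2
    ssb *= power / (power + noise_level ** 2)

    analytic = np.fft.ifft(ssb)
    # Arctangent solves for the (wrapped) phase of the wavefront
    # deformation; phase unwrapping is omitted in this sketch.
    return np.arctan2(analytic.imag, analytic.real)
```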
272
THE HOTELLING TRACE CRITERION USED FOR SYSTEM OPTIMIZATION AND FEATURE ENHANCEMENT IN NUCLEAR MEDICINE (PATTERN RECOGNITION). FIETE, ROBERT DEAN. January 1987
The Hotelling trace criterion (HTC) is a measure of class separability used in pattern recognition to find a set of linear features that optimally separate two classes of objects. In this dissertation we use the HTC not as a figure of merit for features, but as a figure of merit for characterizing imaging systems and designing filters for feature enhancement in nuclear medicine. If the HTC is to be used to optimize systems, it must correlate with human observer performance. In our first study, a set of images created by overlapping ellipses was used to simulate images of livers. Two classes were created, livers with and without tumors, with noise and blur added to each image to simulate nine different imaging systems. Using the ROC parameter dₐ as our measure, we found that the HTC has a correlation of 0.988 with the ability of humans to separate these two classes of objects. A second study demonstrated the use of the HTC for system optimization in a realistic task; for this study we used a mathematical model of normal and diseased livers and of the imaging system to generate a realistic set of nuclear medicine liver images. A method of adaptive, nonlinear filtering that enhances the features separating two sets of images has also been developed. The method uses the HTC to find the optimal linear feature operator for the Fourier moduli of the images, and uses this operator as a filter so that the features that separate the two classes of objects are enhanced. We demonstrate the use of this filtering method to enhance texture features in simulated nuclear medicine liver images, after using a training set of images to obtain the filter. We also demonstrate how this method of filtering can be used to reconstruct an object from a single photon-starved image of it when the object contains a repetitive feature. When power spectra for real liver scans from nuclear medicine are calculated, we find that the three classifications a physician uses (normal, patchy, and focal) can be described by the fractal dimension of the texture in the liver. This fractal dimension can be calculated even for images that suffer from substantial noise and blur. Given a simulated liver image that has been blurred and imaged with only 5000 photons, a texture with the same fractal dimension as the liver can be reconstructed.
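[Editor's note: for readers unfamiliar with the HTC, a minimal sketch of its standard computation for two classes of feature vectors follows. The equal-prior weighting and the regularization term are assumptions of the sketch.]

```python
import numpy as np

def hotelling_trace(class_a, class_b):
    """Hotelling trace criterion tr(S2^-1 S1) for two classes.

    class_a, class_b: arrays of shape (n_samples, n_features).
    S1 is the between-class scatter, S2 the average within-class
    scatter; a larger trace means better class separability.
    """
    mu_a, mu_b = class_a.mean(axis=0), class_b.mean(axis=0)
    grand_mean = 0.5 * (mu_a + mu_b)  # equal priors assumed

    # Between-class scatter matrix.
    d_a, d_b = mu_a - grand_mean, mu_b - grand_mean
    s1 = 0.5 * (np.outer(d_a, d_a) + np.outer(d_b, d_b))

    # Average within-class scatter, lightly regularized so the
    # inverse exists for small training sets.
    s2 = 0.5 * (np.cov(class_a, rowvar=False) +
                np.cov(class_b, rowvar=False))
    s2 += 1e-6 * np.eye(s2.shape[0])

    return np.trace(np.linalg.solve(s2, s1))
```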
273
DIGITAL COLOR IMAGE ENHANCEMENT BASED ON LUMINANCE & SATURATION. KIM, CHEOL-SUNG. January 1987
This dissertation analyzes the characteristics that distinguish color images from monochromatic images, combines these characteristics with monochromatic image enhancement techniques, and proposes useful color image enhancement algorithms. Luminance, hue, and saturation (L-H-S) color space is selected for color image enhancement. Color luminance is shown to play the most important role in achieving good image enhancement. Color saturation also exhibits unique features that contribute to the enhancement of high frequency details and color contrast. The local windowing method, one of the most popular image processing techniques, is rigorously analyzed for the effects of window size and weighting values on the visual appearance of an image, and the subjective enhancement afforded by local image processing techniques is explained in terms of the response of the human visual system. The proposed digital color image enhancement algorithms are based on the observation that enhancing the luminance image yields a good color image in L-H-S color space when the chromatic components (hue and saturation) are kept the same. The saturation component usually contains high frequency details that are not present in the luminance component. However, processing only the saturation, while keeping the luminance and the hue unchanged, is not satisfactory because the human visual system acts as a low pass filter to the chromatic components. To exploit the high frequency details of the saturation component, we take the high frequency component of the inverse saturation image, which correlates with the luminance image, and process the luminance image proportionally to this inverse saturation image. These proposed algorithms are simple to implement. The three main application areas in image enhancement (contrast enhancement, sharpness enhancement, and noise smoothing) are discussed separately. The computer processing algorithms are restricted to those which preserve the natural appearance of the scene.
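[Editor's note: a minimal sketch of the core idea, enhancing the achromatic channel while holding the chromatic components fixed, is given below. Using HSV value as a stand-in for luminance and a 3x3 local mean for the windowing step are simplifications not taken from the dissertation.]

```python
import numpy as np
from matplotlib.colors import rgb_to_hsv, hsv_to_rgb
from scipy.ndimage import uniform_filter

def enhance_color_image(rgb, gain=1.2):
    """Enhance a color image by processing the achromatic channel only.

    rgb: float array in [0, 1], shape (H, W, 3).  Hue and saturation
    are held fixed, following the abstract; HSV value is used here as
    a convenient stand-in for luminance.
    """
    hsv = rgb_to_hsv(rgb)
    v = hsv[..., 2]

    # Simple local contrast enhancement: stretch the value channel
    # about its local mean (3x3 box mean, an illustrative choice).
    local_mean = uniform_filter(v, size=3)
    hsv[..., 2] = np.clip(local_mean + gain * (v - local_mean), 0.0, 1.0)

    return hsv_to_rgb(hsv)
```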
274
Design and simulation of a totally digital image system for medical image applications. Archwamety, Charnchai. January 1987
The Totally Digital Imaging System (TDIS) is based on system requirements from the Radiology Department, University of Arizona Health Science Center. This dissertation presents the design of this complex system, the TDIS specification, the system performance requirements, and the evaluation of the system using computer simulation. Discrete event simulation models were developed for the TDIS subsystems, including an image network, imaging equipment, a storage migration algorithm, a database archive system, and a control and management network. The simulation uses empirical data generation and retrieval rates measured at the University Medical Center hospital. The entire TDIS system was simulated in Simscript II.5 on a VAX 8600 computer system. Simulation results show the fiber optic image network to be suitable; the optical disk storage system, however, represents a performance bottleneck.
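[Editor's note: the original models were written in Simscript II.5; a toy Python/simpy analogue of one such discrete-event model is sketched below, with all rates and counts invented for illustration. It reproduces the flavor of the archive bottleneck, not the dissertation's results.]

```python
import random
import simpy

def workstation(env, archive, waits):
    """A viewing workstation that repeatedly requests studies from
    the shared optical-disk archive (all rates are illustrative)."""
    while True:
        yield env.timeout(random.expovariate(1.0 / 30.0))  # think time (s)
        t0 = env.now
        with archive.request() as req:   # queue for the single drive
            yield req
            yield env.timeout(random.uniform(5.0, 20.0))   # disk read (s)
        waits.append(env.now - t0)

random.seed(1)
env = simpy.Environment()
archive = simpy.Resource(env, capacity=1)  # one optical-disk drive
waits = []
for _ in range(8):                          # eight workstations
    env.process(workstation(env, archive, waits))
env.run(until=4 * 3600)                     # simulate a 4-hour shift
print(f"mean retrieval latency: {sum(waits) / len(waits):.1f} s")
```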
275
Enhancement, tracking, and analysis of digital angiograms. Hayworth, Mark Steven. January 1988
This dissertation presents image processing methods for enhancing images obtained by angiography and image analysis methods for quantifying vessel diameter. An iterative, nonlinear enhancement technique is described for enhancing the edges of blood vessels in unsubtracted angiographic images. The technique uses a median filter and the point spread function of the imaging system to increase the resolution of the image while suppressing noise. Evaluation of the images by radiologists showed that they preferred the processed images over the unprocessed images. Also described is a heuristic, recursive vessel tracking algorithm intended for use with digital subtraction angiography images. The vascular system is characterized by a tree data structure; tree structures are inherently recursive, so recursive programming languages are well suited to building and describing them. The tracker uses a window to follow the centerlines of the vessels and stores parameters describing the vessels in the nodes of a binary tree. Branching of the vascular tree is handled automatically. A least squares fit of a cylindrical model to intensity profiles of the vessel is used to estimate vessel diameter and other parameters. The tracker is able to track vessels successfully at signal-to-noise ratios down to about 4. Several criteria are applied to distinguish between vessel and noise. The relative accuracy of the diameter estimate is about 3% to 8% for a signal-to-noise ratio of 10; the absolute accuracy depends on the magnification (mm per sample). For the clinically significant case of a 25% stenosis (narrowing of the vessel), the absolute error in estimating the percent stenosis is 3.7% of the normal diameter and the relative error is 14.8%, a substantial improvement over the relative errors of 30% to 70% produced by other methods.
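[Editor's note: a minimal sketch of the diameter-estimation step, fitting a projected-cylinder intensity model to one profile taken across the vessel, is given below. The parameterization and initial guesses are illustrative assumptions, not the dissertation's.]

```python
import numpy as np
from scipy.optimize import curve_fit

def cylinder_profile(x, center, radius, density, background):
    """Projected intensity of a uniform cylinder: proportional to the
    chord length through the vessel at each lateral position x."""
    chord = np.clip(radius ** 2 - (x - center) ** 2, 0.0, None)
    return density * np.sqrt(chord) + background

def estimate_diameter(profile, mm_per_sample):
    """Least-squares fit of the cylinder model to one intensity profile
    perpendicular to the vessel centerline; returns diameter in mm."""
    x = np.arange(profile.size, dtype=float)
    r0 = profile.size / 4.0            # heuristic initial radius
    p0 = [float(np.argmax(profile)),   # center
          r0,                          # radius
          np.ptp(profile) / r0,        # density (peak ~ density*radius)
          float(np.min(profile))]      # background
    params, _ = curve_fit(cylinder_profile, x, profile, p0=p0)
    return 2.0 * abs(params[1]) * mm_per_sample
```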
276
USE OF A PRIORI INFORMATION FOR IMPROVED TOMOGRAPHIC IMAGING IN CODED-APERTURE SYSTEMS. GINDI, GENE ROBERT. January 1982
Coded-aperture imaging offers a method of classical tomographic imaging by encoding the distance of a point from the detector in the lateral scale of the point response function. An estimate, termed a layergram, of the transverse sections of the object can be obtained by performing a simple correlation operation on the detector data. The estimate of one transverse plane contains artifacts contributed by source points from all other planes. These artifacts can be partially removed by a nonlinear algorithm that incorporates a priori knowledge of the total integrated object activity per transverse plane, the positivity of the quantity being measured, and the lateral extent of the object in each plane. The algorithm is iterative and contains, at each step, a linear operation followed by the imposition of a constraint. The use of this class of algorithms is tested by simulating a coded-aperture imaging situation using a one-dimensional code and a two-dimensional object (one axis perpendicular to the aperture). Results show nearly perfect reconstructions in noise-free cases for the codes tested. If finite detector resolution and Poisson source noise are taken into account, the reconstructions are still significantly improved relative to the layergram. The algorithm lends itself to implementation on an optical-digital hybrid computer. The problems inherent in a prototype device are characterized and results of its performance are presented.
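[Editor's note: a minimal sketch of such an iteration, a linear update followed by the three constraints named above, might read as follows. The gradient-style update and step size are assumptions of the sketch, not the dissertation's exact operator.]

```python
import numpy as np

def constrained_restore(layergram, forward_op, adjoint_op, support,
                        plane_activity, n_iter=50, step=0.1):
    """Iterative artifact removal: linear update, then constraints.

    forward_op/adjoint_op: callables modeling the system response and
    its adjoint.  support: binary mask of the lateral extent per plane.
    plane_activity: known total integrated activity per transverse
    plane.  Step size and iteration count are illustrative.
    """
    f = layergram.copy()
    for _ in range(n_iter):
        # Linear operation: steepest-descent-style update toward data.
        f += step * adjoint_op(layergram - forward_op(f))
        # Constraint 1: positivity of the measured quantity.
        f = np.clip(f, 0.0, None)
        # Constraint 2: lateral extent (support) of the object.
        f *= support
        # Constraint 3: rescale each transverse plane to its known
        # total integrated activity.
        for z in range(f.shape[0]):
            total = f[z].sum()
            if total > 0:
                f[z] *= plane_activity[z] / total
    return f
```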
277
Imagery of the bilateral symmetrical optical system. Sasian Alvarado, Jose Manuel. January 1988
A brief study of the imagery of the bilateral symmetric optical system is presented. This study has been developed with a theoretical structure similar to that of the rotationally symmetric optical system and can be considered a generalization. It provides a simple, clear understanding of the main features of the imagery of the optical systems under consideration and gives useful design insight. Some design examples are provided that illustrate the use and value of the theory developed.
278
The Design and Analysis of Computed Tomographic Imaging Spectrometers (CTIS) Using Fourier and Wavelet Crosstalk Matrices. Scholl, James Francis. January 2010
The properties and imaging performance of the computed tomographic imaging spectrometer (CTIS) have been investigated with Fourier and wavelet crosstalk matrices. These matrices and their corresponding datacube reconstruction algorithms explicitly use sensitivity equations describing the CTIS imaging system. These equations, derived from Fraunhofer diffraction theory of the computer-generated hologram (CGH) disperser, serve as the mathematical model of the CTIS. The Fourier crosstalk matrix (FCTM) was primarily used to analyze the CTIS imaging system. The FCTM describes which spatial and spectral frequencies contribute to object cube data entering the system and whether or not these frequencies give distinct contributions with respect to each other. Furthermore, since the CTIS is a limited-angle tomographic imaging system, the missing cone of frequencies characteristic of this instrument is clearly shown by the FCTM. Consequently, Fourier-based estimates of the reconstructed object cube (i.e., the datacube) will be missing this frequency information even if the CTIS is a perfect optical system. The wavelet crosstalk matrix (WCTM) was used primarily for efficient datacube reconstruction. The datacube reconstruction calculations are primarily proof-of-concept and reproduce the Fourier results without some of the Fourier-related artifacts. The wavelet decomposition of the object cube is useful for studying multiple objects in a parallel processing environment without reconstructing the entire datacube, thus reducing overall complexity. Datacube reconstructions of actual astronomical observations with the CTIS, using the techniques of this research, were consistent with previous independent datacube estimates from the same data using existing conventional techniques. Furthermore, these objects furnish natural point-spread functions that supplement computational simulations of the CTIS by describing actual imaging system performance. The computational tools developed for studying the CTIS imaging system provide the additional bonus of an analysis of object detectability via computation of receiver operating characteristic (ROC) curves; we used a synthetic binary star to simulate this in the presence of both detector and object noise. Some suggestions for future research directions are given.
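[Editor's note: as a rough illustration, the Fourier crosstalk matrix of a discretized system matrix H can be formed by projecting H onto Fourier basis vectors, B = W^H H^H H W. The finite basis truncation and one-dimensional object grid below are assumptions of this sketch.]

```python
import numpy as np

def fourier_crosstalk(H, n_freq):
    """Fourier crosstalk matrix B = W^H H^H H W.

    H: system matrix (measurements x object voxels).  Columns of W
    are the first n_freq DFT basis vectors over the object grid.
    """
    n_vox = H.shape[1]
    k = np.arange(n_vox)
    j = np.arange(n_freq)
    # DFT basis as columns: W[:, j] = exp(2*pi*i*j*k/N) / sqrt(N)
    W = np.exp(2j * np.pi * np.outer(k, j) / n_vox) / np.sqrt(n_vox)

    HW = H @ W
    B = HW.conj().T @ HW

    # Diagonal entries measure how strongly each frequency is sensed;
    # near-zero entries flag the missing cone of a limited-angle
    # system, and off-diagonals quantify inter-frequency crosstalk.
    return B
```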
279
Design of a high speed fiber optic network interface for medical image transfer. Byers, Daniel James, 1958-. January 1987
A high speed data communication channel, with a 125 megabit per second data rate, using fiber optic technology is described. Medical image data, generated by imaging equipment such as CT scanners or magnetic resonance imagers, passes from standard American College of Radiology - National Electrical Manufacturers Association (ACR-NEMA) interface equipment to the High Speed Fiber Optic Network Interface (HSFONI). The HSFONI implements the physical layer of the ACR-NEMA standard interface with fiber optics. The HSFONI accepts data from up to 8 devices and passes data to other devices or to a database archive system for storage and future viewing and analysis. The fiber components, system level and functional level considerations, and the hardware circuit implementation are discussed.
280
Image quality assessment using algorithmic and machine learning techniques. Li, Cui. January 2009
The first area of work assesses image quality by measuring the similarity between the edge map of a distorted image and that of its original version, using classical edge quality evaluation metrics. Experiments show that comparing the edge maps of the original and distorted images gives better results than comparing the images themselves. Based on redefined source and distortion models, a novel full-reference (FR) image quality assessment metric, DQM, is proposed; subsequent experiments show it to be competitive with state-of-the-art metrics (SSIM, IFC, VIF, etc.). The thesis also proposes several image quality metrics based on a framework for developing image quality assessment algorithms with the help of data-driven models (multiple linear regression, artificial neural networks, and support vector machines). Among them, CAM_BPNN and CAM_SVM perform better than SSIM and can even compete with its improved multi-scale version, MSSIM. Apart from the research on FR image quality assessment, a novel reduced-reference (RR) image quality assessment system is proposed, based on low-level features (corners, edges, and symmetry).
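[Editor's note: a minimal sketch of the edge-map comparison idea follows. The Sobel detector, fixed threshold, and Dice overlap score are stand-ins for the thesis's classical edge quality evaluation metrics, not its actual formulation.]

```python
import numpy as np
from scipy import ndimage

def sobel_edge_map(img, thresh=0.1):
    """Binary edge map from Sobel gradient magnitude; the fixed
    relative threshold is an illustrative choice."""
    gx = ndimage.sobel(img, axis=1)
    gy = ndimage.sobel(img, axis=0)
    mag = np.hypot(gx, gy)
    return mag > thresh * mag.max()

def edge_map_quality(reference, distorted):
    """Full-reference quality score from edge-map similarity.

    Comparing edge maps rather than raw pixels follows the abstract;
    Dice overlap is used here as a simple similarity measure.
    """
    e_ref = sobel_edge_map(reference)
    e_dst = sobel_edge_map(distorted)
    inter = np.logical_and(e_ref, e_dst).sum()
    return 2.0 * inter / (e_ref.sum() + e_dst.sum() + 1e-12)
```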