  • About
  • The Global ETD Search service is a free service for researchers to find electronic theses and dissertations. This service is provided by the Networked Digital Library of Theses and Dissertations.
    Our metadata is collected from universities around the world. If you manage a university/consortium/country archive and want to be added, details can be found on the NDLTD website.
51

2D and 3D segmentation of medical images

Jones, Jonathan-Lee January 2015 (has links)
Cardiovascular disease is one of the leading causes of morbidity and mortality in the western world today. Many different imaging modalities are in use to diagnose and investigate cardiovascular diseases; each, however, has strengths and weaknesses. The different forms of noise and artifacts in each modality combine to make the field of medical image analysis both important and challenging. The aim of this thesis is to develop a reliable method for segmentation of vessel structures in medical imaging, incorporating the expert knowledge of the user in such a way as to maintain efficiency whilst overcoming the inherent noise and artifacts present in the images. We present results from 2D segmentation techniques using different methodologies, before developing 3D techniques for segmenting vessel shape from a series of images. The main part of the work involves the investigation of medical images obtained using catheter-based techniques, namely Intra Vascular Ultrasound (IVUS) and Optical Coherence Tomography (OCT). We present a robust segmentation paradigm that combines edge and region information to segment the media-adventitia and luminal borders in those modalities respectively, using a semi-interactive method with "soft" constraints that accepts imprecise user input and so balances the user's expert knowledge against efficiency. In the later part of the work, we develop automatic methods for segmenting the walls of lymph vessels. These methods are employed on sequential images in order to obtain data for reconstructing the vessel walls in the region of the lymph valves. We investigated methods to segment the vessel walls both individually and simultaneously, and compared the results both quantitatively and qualitatively in order to obtain the most appropriate method for the 3D reconstruction of the vessel wall. Lastly, we extend the semi-interactive method used earlier on vessels into 3D to help segment the lymph valve: the interactive step guides the segmentation of the boundary of the lymph vessel, and a minimal surface segmentation methodology is then applied to segment the valve.
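
As a purely illustrative sketch, and not the formulation used in the thesis, an energy of the kind described above, combining an edge term, two region terms, and a soft penalty that pulls the contour toward imprecise user-supplied points, could take the following form; the weights α, β, γ, the edge indicator g, and the points p_k are assumptions introduced here for illustration:

```latex
E(C) = \alpha \oint_{C} g\!\left(|\nabla I|\right) ds
     + \beta \int_{\Omega_{\text{in}}} \left(I(\mathbf{x}) - \mu_{\text{in}}\right)^{2} d\mathbf{x}
     + \beta \int_{\Omega_{\text{out}}} \left(I(\mathbf{x}) - \mu_{\text{out}}\right)^{2} d\mathbf{x}
     + \gamma \sum_{k} \operatorname{dist}\left(C, p_{k}\right)^{2}
```

Here Ω_in and Ω_out are the regions inside and outside the contour C, μ_in and μ_out their mean intensities, and g a decreasing function of the image gradient; the last term is "soft" in that it penalises, rather than enforces, deviation from the user input.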
52

Measurement of glucose metabolism using positron imaging and 18F-labeled analogs

Kearfott, Kimberlee Jane. January 1980 (has links)
Thesis: Sc. D., Massachusetts Institute of Technology, Department of Nuclear Engineering, 1980. Bibliography: leaves 348-372. By Kimberlee Jane Kearfott.
53

A REPROGRAMMABLE HIGH SPEED INTERFACE DESIGN FOR A PICTURE ARCHIVING AND COMMUNICATION SYSTEM

Brinks, Raymond Gerald, 1960- January 1987 (has links)
High resolution imaging devices have made a digital medical archiving system feasible. The large volumes of information generated must be stored and retrieved at high data rates in order to ensure the timely diagnosis of patients. This creates some unique technological challenges that must be resolved, including the problem of multiple vendor products interacting in one environment. The high speed interface card design presented in this thesis is able to deal with different computer host busses as well as different interprocessor communication protocols. The ACR-NEMA standard has been implemented in the design as one possible network protocol, providing a solution that can be easily adapted to different vendors. The design has been analyzed using the Network II.5 simulation language. The simulation was performed to ensure that the original objectives are met and to determine the impact on the protocol's rated throughput.
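
A minimal discrete-event sketch in the spirit of such a throughput study is given below. It is not the Network II.5 model from the thesis: the link rate, image size, device count and arrival rate are all assumptions chosen for illustration, and the interface is modelled as a single first-come, first-served server.

```python
# Toy discrete-event simulation of several imaging devices sharing one interface card.
import heapq
import random

LINK_MBIT_S = 100.0                       # assumed interface throughput, megabits/s
IMAGE_MBITS = 2048 * 2048 * 16 / 1e6      # assumed 2048 x 2048, 16-bit image
SERVICE_S = IMAGE_MBITS / LINK_MBIT_S     # time to push one image through the card
N_DEVICES = 4                             # assumed number of attached modalities
MEAN_GAP_S = 5.0                          # assumed mean time between images per device
SIM_TIME_S = 3600.0                       # simulate one hour

def simulate(seed: int = 1) -> None:
    random.seed(seed)
    # Event queue of (arrival_time, device_id) pairs.
    events = [(random.expovariate(1.0 / MEAN_GAP_S), d) for d in range(N_DEVICES)]
    heapq.heapify(events)
    free_at = 0.0        # time at which the interface next becomes idle
    busy = 0.0
    delays = []
    while events:
        t, dev = heapq.heappop(events)
        if t > SIM_TIME_S:
            break
        start = max(t, free_at)            # queue if the interface is still busy
        delays.append(start - t)
        free_at = start + SERVICE_S
        busy += SERVICE_S
        # Schedule this device's next image.
        heapq.heappush(events, (t + random.expovariate(1.0 / MEAN_GAP_S), dev))
    print(f"interface utilization: {busy / SIM_TIME_S:.1%}")
    print(f"mean queueing delay  : {sum(delays) / len(delays):.3f} s over {len(delays)} images")

if __name__ == "__main__":
    simulate()
```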
54

APPLICATION OF ACOUSTIC NUCLEAR MAGNETIC RESONANCE TO MEDICAL IMAGING

Hirsch, Thomas John, 1958- January 1986 (has links)
No description available.
55

An Investigation into EPID Flood Fields Independent from the Linear Accelerator Beam

Satory, Philip Reynard January 2008 (has links)
The EPID (electronic portal imaging device) was designed for in vivo imaging of patients during radiotherapy treatment. The ability of EPIDs to promptly acquire two-dimensional data makes them candidates for use in quality assurance of the linac. This thesis set out to investigate the possibility of using a radionuclide, technetium-99m (Tc99m), to produce a flood field for the calibration of an EPID, because using a beam-calibrated EPID to measure the beam is self-referential. The difference in relative response between the energy spectrum of a 6 MV beam and the Tc99m was investigated using EGSNRC DoseXYZ Monte Carlo modelling. The relative output ratio was calculated to be less than 1.6%. The dose response of the EPID with respect to dose rate was checked using different activities of Tc99m and found to be linear. The flatness from a phantom was calculated, with a model in MATLAB, for a range of heights, overlaps, thicknesses, and deformations, to find the optimum balance between signal strength and flatness. This model was checked for accuracy using diagnostic radiographic film. The culmination of the energy response, linearity and the calculated flatness is a flood field taken with a flood phantom on the EPID at low signal strength. To get a signal-to-noise ratio of 3%, the mean of over 2000 flood field images was used. This accuracy was not adequate for clinical use, but with the averaging of pixels it is accurate enough for QA.
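
A hedged sketch of the averaging arithmetic implied above, treating the 3% figure as a target relative noise level and assuming uncorrelated frames; the single-frame noise value is an assumption chosen only to reproduce the order of magnitude of the ~2000 images quoted.

```python
# Uncorrelated pixel noise falls as 1/sqrt(N), so the number of flood images needed
# to reach a target relative noise level is N = (sigma_single / sigma_target)^2.
import math

def frames_needed(single_frame_noise: float, target_noise: float) -> int:
    """Frames to average so relative noise drops to the target (1/sqrt(N) scaling)."""
    return math.ceil((single_frame_noise / target_noise) ** 2)

if __name__ == "__main__":
    sigma_single = 1.35   # assumed relative noise of one low-signal Tc99m flood image
    target = 0.03         # 3% target quoted in the abstract
    print(frames_needed(sigma_single, target))   # on the order of 2000 frames
```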
56

THE HOTELLING TRACE CRITERION USED FOR SYSTEM OPTIMIZATION AND FEATURE ENHANCEMENT IN NUCLEAR MEDICINE (PATTERN RECOGNITION).

FIETE, ROBERT DEAN. January 1987 (has links)
The Hotelling trace criterion (HTC) is a measure of class separability used in pattern recognition to find a set of linear features that optimally separate two classes of objects. In this dissertation we use the HTC not as a figure of merit for features, but as a figure of merit for characterizing imaging systems and designing filters for feature enhancement in nuclear medicine. If the HTC is to be used to optimize systems, then it must correlate with human observer performance. In our first study, a set of images, created by overlapping ellipses, was used to simulate images of livers. Two classes were created, livers with and without tumors, with noise and blur added to each image to simulate nine different imaging systems. Using the ROC parameter dₐ as our measure, we found that the HTC has a correlation of 0.988 with the ability of humans to separate these two classes of objects. A second study was performed to demonstrate the use of the HTC for system optimization in a realistic task. For this study we used a mathematical model of normal and diseased livers and of the imaging system to generate a realistic set of liver images from nuclear medicine. A method of adaptive, nonlinear filtering which enhances the features that separate two sets of images has also been developed. The method uses the HTC to find the optimal linear feature operator for the Fourier moduli of the images, and uses this operator as a filter so that the features that separate the two classes of objects are enhanced. We demonstrate the use of this filtering method to enhance texture features in simulated liver images from nuclear medicine, after using a training set of images to obtain the filter. We also demonstrate how this method of filtering can be used to reconstruct an object from a single photon-starved image of it, when the object contains a repetitive feature. When power spectra for real liver scans from nuclear medicine are calculated, we find that the three classifications that a physician uses, normal, patchy, and focal, can be described by the fractal dimension of the texture in the liver. This fractal dimension can be calculated even for images that suffer from much noise and blur. Given a simulated image of a liver that has been blurred and imaged with only 5000 photons, a texture with the same fractal dimension as the liver can be reconstructed.
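
For concreteness, a small computation of the criterion on synthetic feature vectors is sketched below. This is not code from the dissertation; it simply evaluates the standard form of the Hotelling trace, J = tr(S2⁻¹ S1), with S1 the between-class scatter and S2 the average within-class scatter, on made-up "normal" and "tumour" samples.

```python
import numpy as np

def hotelling_trace(class_a: np.ndarray, class_b: np.ndarray) -> float:
    """class_a, class_b: (n_samples, n_features) arrays of linear features."""
    mu_a, mu_b = class_a.mean(axis=0), class_b.mean(axis=0)
    mu = 0.5 * (mu_a + mu_b)
    # Between-class scatter: outer products of the class-mean deviations.
    s1 = 0.5 * (np.outer(mu_a - mu, mu_a - mu) + np.outer(mu_b - mu, mu_b - mu))
    # Within-class scatter: average of the two class covariance matrices.
    s2 = 0.5 * (np.cov(class_a, rowvar=False) + np.cov(class_b, rowvar=False))
    # tr(S2^{-1} S1), computed via a linear solve rather than an explicit inverse.
    return float(np.trace(np.linalg.solve(s2, s1)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    normal = rng.normal(0.0, 1.0, size=(200, 5))   # stand-in "no tumour" features
    tumour = rng.normal(0.6, 1.0, size=(200, 5))   # stand-in "tumour" features
    print(f"HTC = {hotelling_trace(normal, tumour):.3f}")
```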
57

Design and simulation of a totally digital image system for medical image applications.

Archwamety, Charnchai. January 1987 (has links)
The Totally Digital Imaging System (TDIS) is based on system requirements information from the Radiology Department, University of Arizona Health Science Center. This dissertation presents the design of this complex system, the TDIS specification, the system performance requirements, and the evaluation of the system using computer simulation programs. Discrete-event simulation models were developed for the TDIS subsystems, including an image network, imaging equipment, a storage migration algorithm, a database archive system, and a control and management network. The simulation uses empirical data generation and retrieval rates measured at the University Medical Center hospital. The entire TDIS system was simulated in Simscript II.5 on a VAX 8600 computer system. Simulation results show the fiber optic image network to be suitable; however, the optical disk storage system represents a performance bottleneck.
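
The kind of bottleneck identified above can be illustrated with a back-of-envelope check of offered load against archive write throughput. Every figure in the sketch below is an assumption chosen for illustration, not a measurement from the dissertation.

```python
# Compare the image generation rate against the sustained archive write rate.
GEN_IMAGES_PER_HOUR = 120          # assumed hospital-wide image generation rate
IMAGE_MBYTES = 8.0                 # assumed 2048 x 2048, 16-bit image, ~8 MB
ARCHIVE_WRITE_MBYTES_S = 0.25      # assumed sustained optical-disk write rate

offered_mbytes_s = GEN_IMAGES_PER_HOUR * IMAGE_MBYTES / 3600.0
print(f"offered load : {offered_mbytes_s:.3f} MB/s")
print(f"archive limit: {ARCHIVE_WRITE_MBYTES_S:.3f} MB/s")
print("archive is the bottleneck" if offered_mbytes_s > ARCHIVE_WRITE_MBYTES_S
      else "archive keeps up")
```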
58

Enhancement, tracking, and analysis of digital angiograms.

Hayworth, Mark Steven. January 1988 (has links)
This dissertation presents image processing methods designed to enhance images obtained by angiography, and image analysis methods applied to quantify vascular diameter. An iterative, non-linear enhancement technique is described for enhancing the edges of blood vessels in unsubtracted angiographic images. The technique uses a median filter and the point spread function of the imaging system to increase the resolution of the image while keeping noise down. Evaluation of the images by radiologists showed that they preferred the processed images over the unprocessed images. Also described is a heuristic, recursive vessel tracking algorithm. The tracker is intended for use with digital subtraction angiography images. The vascular system is characterized by a tree data structure. Tree structures are inherently recursive, and thus recursive programming languages are ideally suited for building and describing them. The tracker uses a window to follow the centerlines of the vessels and stores parameters describing the vessels in the nodes of a binary tree. Branching of the vascular tree is handled automatically. A least squares fit of a cylindrical model to intensity profiles of the vessel is used to estimate vessel diameter and other parameters. The tracker is able to successfully track vessels with signal-to-noise ratios down to about 4. Several criteria are applied to distinguish between vessel and noise. The relative accuracy of the diameter estimate is about 3% to 8% for a signal-to-noise ratio of 10; the absolute accuracy depends on the magnification (mm per sample). For the clinically significant case of a 25% stenosis (narrowing of the vessel), the absolute error in estimating the percent stenosis is 3.7% of the normal diameter and the relative error is 14.8%. This relative error of 14.8% is a substantial improvement over the relative errors of 30% to 70% produced by other methods.
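
A sketch of the tree representation described above follows; the field names and the recursive traversal are assumptions for illustration, not the dissertation's data structure.

```python
# Binary-tree node for tracked vessel segments, with a recursive traversal.
from dataclasses import dataclass
from typing import Optional, List, Tuple

@dataclass
class VesselNode:
    centerline: List[Tuple[float, float]]   # tracked centre points (x, y)
    diameter_mm: float                      # from the cylindrical-model fit
    left: Optional["VesselNode"] = None     # first daughter branch, if any
    right: Optional["VesselNode"] = None    # second daughter branch, if any

def collect_diameters(node: Optional[VesselNode]) -> List[float]:
    """Recursive walk of the tree, gathering the fitted diameter of every segment."""
    if node is None:
        return []
    return [node.diameter_mm] + collect_diameters(node.left) + collect_diameters(node.right)

if __name__ == "__main__":
    tree = VesselNode([(0, 0), (0, 5)], 4.0,
                      left=VesselNode([(0, 5), (3, 9)], 2.5),
                      right=VesselNode([(0, 5), (-3, 9)], 2.8))
    print(collect_diameters(tree))   # [4.0, 2.5, 2.8]
```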
59

USE OF A PRIORI INFORMATION FOR IMPROVED TOMOGRAPHIC IMAGING IN CODED-APERTURE SYSTEMS.

GINDI, GENE ROBERT. January 1982 (has links)
Coded-aperture imaging offers a method of classical tomographic imaging by encoding the distance of a point from the detector by the lateral scale of the point response function. An estimate, termed a layergram, of the transverse sections of the object can be obtained by performing a simple correlation operation on the detector data. The estimate of one transverse plane contains artifacts contributed by source points from all other planes. These artifacts can be partially removed by a nonlinear algorithm which incorporates a priori knowledge of total integrated object activity per transverse plane, positivity of the quantity being measured, and lateral extent of the object in each plane. The algorithm is iterative and contains, at each step, a linear operation followed by the imposition of a constraint. The use of this class of algorithms is tested by simulating a coded-aperture imaging situation using a one-dimensional code and two-dimensional (one axis perpendicular to aperture) object. Results show nearly perfect reconstructions in noise-free cases for the codes tested. If finite detector resolution and Poisson source noise are taken into account, the reconstructions are still significantly improved relative to the layergram. The algorithm lends itself to implementation on an optical-digital hybrid computer. The problems inherent in a prototype device are characterized and results of its performance are presented.
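
The iteration described above, a linear operation followed by the imposition of constraints, can be sketched as follows. The linear update used here is a placeholder and the constraint values are assumptions; only the projection step (known lateral support, positivity, known total activity per plane) follows the description.

```python
import numpy as np

def apply_constraints(plane: np.ndarray, support: np.ndarray, total_activity: float) -> np.ndarray:
    est = np.where(support, plane, 0.0)   # zero outside the known lateral extent
    est = np.clip(est, 0.0, None)         # positivity of the measured quantity
    s = est.sum()
    if s > 0:
        est *= total_activity / s         # rescale to the known total per plane
    return est

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    layergram = rng.normal(0.5, 0.5, size=(32, 32))   # stand-in noisy starting estimate
    support = np.zeros((32, 32), dtype=bool)
    support[8:24, 8:24] = True                        # assumed known object extent
    est = layergram.copy()
    for _ in range(10):
        est = 0.8 * est + 0.2 * layergram             # placeholder linear step
        est = apply_constraints(est, support, total_activity=100.0)
    print(f"total = {est.sum():.1f}, min = {est.min():.3f}")
```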
60

Design of a high speed fiber optic network interface for medical image transfer

Byers, Daniel James, 1958- January 1987 (has links)
A high speed (125 megabit per second) data communication channel using fiber optic technology is described. Medical image data, generated by imaging equipment such as CT scanners or magnetic resonance imagers, passes from standard American College of Radiology - National Electrical Manufacturers Association (ACR-NEMA) interface equipment to the High Speed Fiber Optic Network Interface (HSFONI). The HSFONI implements the ACR-NEMA standard interface physical layer with fiber optics. The HSFONI accepts data from up to 8 devices and passes data to other devices or to a database archive system for storage and future viewing and analysis. The fiber components, system-level and functional-level considerations, and the hardware circuit implementation are discussed.
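
As a rough illustration of what the stated 125 megabit per second rate buys, the sketch below estimates per-image transfer times; the image dimensions and the worst-case sharing assumption are illustrative, not figures from the thesis.

```python
# Per-image transfer time over the 125 Mbit/s channel, dedicated vs. shared 8 ways.
LINK_MBIT_S = 125.0                    # data rate stated in the abstract
IMAGE_BITS = 2048 * 2048 * 16          # assumed 2048 x 2048, 16-bit CT/MR image
N_DEVICES = 8                          # up to 8 devices share the HSFONI

per_image_s = IMAGE_BITS / (LINK_MBIT_S * 1e6)
print(f"one image, dedicated link          : {per_image_s:.2f} s")
print(f"one image, link shared 8 ways (worst case): {per_image_s * N_DEVICES:.2f} s")
```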
