321

Seasonal Cycling in Electrical Resistivities at Ten Thin Permafrost Sites, Southern Yukon and Northern British Columbia

Miceli, Christina 26 October 2012 (has links)
Permanent electrode arrays were set up at ten monitoring sites from Whitehorse, Yukon, to Fort St. John, British Columbia, in order to gain a clearer perspective of the effectiveness of electrical resistivity tomography (ERT) monitoring over an annual cycle of freezing and thawing. This research forms part of a longer-term project that is attempting to use ERT to examine changes in permafrost resulting from climate change. Inter-site and intra-site variability were examined by installing and maintaining data-loggers to monitor active layer and shallow permafrost temperatures, air temperatures, and snow depths at each site from August 2010 to August 2011. Additional site information was collected on each ERT survey date, including frost table depths, snow depths, and vegetation heights. Based on nearby community records, the climate in the region has been warming at a rate of 0.3 to 0.5 °C per decade since 1970. The permafrost at all ten sites was characteristic of the sporadic discontinuous and isolated patches permafrost zones, and is classified as ecosystem-protected. At nine of the ten sites, the permafrost was thinner than the 14 or 7 m penetration depth of the ERT survey (a three-layer system consisting of an active layer, permafrost, and a sub-permafrost perennially unfrozen zone). The most predictable results were achieved at the two-layer system site (active layer overlying permafrost to the base of the profile) in each of its virtual resistivity boreholes, relative resistivity change comparisons, and mean near-surface apparent resistivity progressions. ERT is an effective method of delineating permafrost boundaries in thin permafrost environments and is also useful for monitoring areas of seasonally frozen ground. Repeat surveys at a site indicate seasonal changes in three-layer conditions, but less predictably than in a two-layer system.
To obtain the most accurate information on permafrost extent and thickness, it appears ideal to conduct ERT surveys annually, within the same month as the previous year's survey.
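The surveys described above rest on apparent resistivity computed from an injected current and a measured voltage. As a minimal illustrative sketch (the Wenner array and the numbers here are generic assumptions, not the configuration or data from the thesis), the calculation is a geometric factor times the measured resistance:

```python
import math

def wenner_apparent_resistivity(spacing_m, voltage_v, current_a):
    """Apparent resistivity (ohm-m) for a Wenner array: rho_a = K * V / I,
    where the geometric factor is K = 2 * pi * a for electrode spacing a."""
    k = 2.0 * math.pi * spacing_m
    return k * voltage_v / current_a

# Illustrative numbers: 5 m spacing, 0.25 V measured, 10 mA injected
rho_a = wenner_apparent_resistivity(5.0, 0.25, 0.010)  # ~785 ohm-m
```

Frozen ground typically shows far higher apparent resistivity than thawed ground, which is what makes repeat ERT surveys sensitive to the freeze-thaw cycle.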
322

A survey of algebraic algorithms in computerized tomography

Brooks, Martin 01 August 2010 (has links)
X-ray computed tomography (CT) is a medical imaging framework. It takes measured projections of X-rays through two-dimensional cross-sections of an object from multiple angles and applies reconstruction algorithms to build a sequence of two-dimensional images of the interior structure. This thesis comprises a review of the different types of algebraic algorithms used in X-ray CT. Using simulated test data, I evaluate the viability of algorithmic alternatives that could potentially reduce overexposure to radiation, as this is seen as a major health concern and the limiting factor in the advancement of CT [36, 34]. Most of the current evaluations in the literature [31, 39, 11] deal with low-resolution reconstructions and the results are impressive; however, modern CT applications demand very high-resolution imaging. Consequently, I selected five of the fundamental algebraic reconstruction algorithms (ART, SART, Cimmino's method, CAV, DROP) for extensive testing, and the results are reported in this thesis. The quantitative numerical results obtained in this study confirm the qualitative suggestion that algebraic techniques are not yet adequate for practical use. However, as algebraic techniques can actually produce an image from corrupt and/or missing data, I conclude that further refinement of algebraic techniques may ultimately lead to a breakthrough in CT. / UOIT
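ART, the first algorithm named above, is essentially Kaczmarz's method: the current estimate is projected row by row onto each measurement hyperplane. A minimal sketch on a toy linear system (standing in for a projection matrix; not the thesis test data) might look like:

```python
import numpy as np

def art(A, b, iterations=50, relax=1.0):
    """Kaczmarz-style ART: sweep the rows of A, projecting the estimate
    onto each measurement hyperplane a_i . x = b_i in turn."""
    x = np.zeros(A.shape[1])
    row_norms = (A * A).sum(axis=1)
    for _ in range(iterations):
        for i in range(A.shape[0]):
            if row_norms[i] == 0.0:
                continue
            residual = b[i] - A[i] @ x
            x += relax * residual / row_norms[i] * A[i]
    return x

# Tiny consistent system with known solution x = [2, 1]
A = np.array([[1.0, 1.0], [1.0, -1.0]])
b = np.array([3.0, 1.0])
x = art(A, b)
```

With orthogonal rows, as here, the sweep converges immediately; on real projection data convergence is gradual, and a relaxation factor below 1 is commonly used to damp noise.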
323

Development of a High Resolution Microvascular Imaging Toolkit for Optical Coherence Tomography

Mariampillai, Adrian 18 February 2011 (has links)
This thesis presents the development of new optical coherence tomography imaging systems and techniques to improve in vivo 3D microvascular imaging. Specifically, these systems and techniques were proposed to address three main problems with 3D Doppler optical coherence tomography imaging: (a) motion artefacts, (b) angle dependence of the signal, and (c) the relatively high minimum detectable velocity of conventional color Doppler algorithms (~500 μm/s). To overcome these limitations, a multi-pronged strategy was employed: (1) Construction of a retrospectively gated OCT system for the mitigation of periodic motion artefacts. Proof-of-principle in vivo B-mode imaging of Xenopus laevis (African clawed frog) embryo cardiovascular function was demonstrated at up to 1000 frames per second (fps) from data acquired at 12 fps. Additionally, 4D imaging of the Xenopus laevis heart at 45 volumes per second was demonstrated. (2) Construction of a Fourier domain mode locked laser for high speed swept source optical coherence tomography imaging. This laser was capable of reaching sweep rates of 67 kHz and was optimized to function in the SNR-limited phase noise regime up to approximately 55 dB structural SNR. (3) Development of a novel speckle variance image processing algorithm for velocity- and angle-independent 3D microvascular imaging. The velocity and angle independence of the technique was validated through phantom studies. In vivo demonstration of the speckle variance algorithm was performed by imaging the capillary network in the dorsal skin-fold window chamber model, with the results being validated using fluorescence confocal microscopy. In the final part of this thesis, these newly developed technologies were applied to the assessment of anti-vascular and anti-angiogenic therapies in preclinical models, specifically photodynamic therapy and targeted degradation of HIF-α.
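The speckle variance idea in point (3) can be sketched in a few lines: acquire several B-scans at the same location and compute the inter-frame intensity variance per pixel, so that moving scatterers (blood) light up regardless of flow angle. This is a simplified illustration on synthetic data, not the author's implementation:

```python
import numpy as np

def speckle_variance(frames):
    """Per-pixel intensity variance across N repeated B-scans at one
    location; high variance flags moving scatterers independent of the
    Doppler angle."""
    stack = np.asarray(frames, dtype=float)  # shape (N, rows, cols)
    return stack.var(axis=0)

# Synthetic stack: static background plus one fluctuating "vessel" pixel
rng = np.random.default_rng(0)
frames = np.full((8, 4, 4), 10.0)
frames[:, 2, 2] += rng.normal(0.0, 3.0, size=8)  # temporal fluctuation
sv = speckle_variance(frames)  # near zero everywhere except (2, 2)
```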
324

Studies on the Computed Tomography of the Pancreas in Patients of Liver Cirrhosis

SAKUMA, SADAYUKI, ICHIHASHI, HIDEHITO, NAKAGAWA, TAKEO, KATSUMATA, YOSHINAO, KATSUMATA, KAZUO 03 1900 (has links)
No description available.
325

Characterizing the Effects of Respiratory Motion on Pulmonary Nodule-like Objects in Computed Tomography

Hamilton, Michael 01 January 2011 (has links)
Lung nodule volumetry is used to diagnose the likelihood of malignancy in nodules detected during thoracic CT scans. These measurements are unreliable when the patient is subject to respiratory motion. We seek to understand the relationship between reconstructed images and the actual size of nodules subject to motion induced by quiet breathing. CT images of solid spheres of varying size and composition were acquired while the spheres travelled along a known path to approximate the motion of a pulmonary nodule during respiration. The measured size of the sphere’s image was found to increase non-linearly with speed. However, these relationships were dependent on the CT number of the sphere and the reconstruction filter used to generate the image. From these results we expect that, for a specific CT number, we can estimate the size of an object from a CT image if the speed of the object at the time of the scan is known.
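The correction suggested in the final sentence, recovering true size from measured size given the object's speed, can be sketched with a hypothetical calibration curve. The numbers below are invented for illustration and are not the thesis measurements; a real correction would also depend on CT number and reconstruction filter, as the abstract notes:

```python
import numpy as np

# Hypothetical calibration: measured diameter (mm) of a known 8 mm sphere
# imaged at several speeds (mm/s); values are illustrative only.
speeds = np.array([0.0, 5.0, 10.0, 20.0])
measured = np.array([8.0, 8.6, 9.9, 13.1])

def true_size_estimate(measured_mm, speed_mm_s):
    """Correct a measured size by subtracting the speed-dependent bias
    interpolated from the calibration curve (bias = 0 at zero speed)."""
    bias = np.interp(speed_mm_s, speeds, measured) - measured[0]
    return measured_mm - bias

est = true_size_estimate(9.9, 10.0)  # recovers ~8 mm at 10 mm/s
```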
328

Improved image speckle noise reduction and novel dispersion cancellation in Optical Coherence Tomography

Puvanathasan, Prabakar January 2008 (has links)
Optical coherence tomography (OCT) is an innovative modern biomedical imaging technology that allows in-vivo, non-invasive imaging of biological tissues. At present, some of the major challenges in OCT include the need for fast data acquisition systems for probing fast-developing biochemical processes in biological tissue, for image processing algorithms to reduce speckle noise and remove motion artefacts, and for dispersion compensation to improve axial resolution and image contrast. To address the need for fast data acquisition, a novel, high speed (47,000 A-scans/s), ultrahigh axial resolution (3.3 μm) Fourier Domain Optical Coherence Tomography (FD-OCT) system in the 1060 nm wavelength region has been built at the University of Waterloo. The system provides 3.3 μm image resolution in biological tissue and a maximum sensitivity of 110 dB. Retinal tomograms acquired in-vivo from a human volunteer and a rat animal model show clear visualization of all intra-retinal layers and increased penetration into the choroid. OCT is based on low-coherence light interferometry; thus, image quality depends on the spatial and temporal coherence properties of the optical waves back-scattered from the imaged object. Due to the coherent nature of light, OCT images are contaminated with speckle noise. Two novel speckle noise reduction algorithms based on interval type II fuzzy sets have been developed to improve the quality of OCT images. One algorithm combines anisotropic diffusion with an interval type II fuzzy system, while the other is based on soft thresholding of wavelet coefficients using an interval type II fuzzy system. Application of these algorithms to the Cameraman test image corrupted with speckle noise (variance = 0.1) resulted in a root mean square error (RMSE) of 0.07 for both the fuzzy anisotropic diffusion and fuzzy wavelet algorithms.
This is lower than the results obtained for the Wiener (RMSE = 0.09), adaptive Lee (RMSE = 0.09), and median (RMSE = 0.12) filters. Applying the algorithms to optical coherence tomograms acquired in-vivo from a human finger-tip shows reduction in speckle noise and image SNR improvements of ~13 dB for fuzzy anisotropic diffusion and ~11 dB for fuzzy wavelet. Comparison with the Wiener (~3 dB SNR improvement), adaptive Lee (~5 dB), and median (~5 dB) filters, applied to the same images, demonstrates the better performance of the fuzzy type II algorithms in terms of image metrics. Micrometer-scale OCT image resolution is obtained through the use of broad-bandwidth light sources. However, the large spectral bandwidth of the imaging beam results in broadening of the OCT interferogram because of the dispersive properties of the imaged objects. This broadening causes deterioration of the axial resolution as well as loss of contrast in OCT images. A novel even-order dispersion cancellation interferometry technique, implemented with a linear, classical interferometer, has been developed and can be further expanded to dispersion-cancelled OCT.
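The wavelet branch of the approach, soft thresholding of detail coefficients, can be illustrated without the interval type II fuzzy machinery by applying a fixed threshold to a one-level Haar transform. This is a deliberate simplification of the thesis method, where the threshold is instead derived from the fuzzy system:

```python
import numpy as np

def soft_threshold(coeffs, t):
    """Soft thresholding: shrink coefficients toward zero by t,
    zeroing anything smaller than t in magnitude."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

def haar_denoise_1d(signal, t):
    """One-level Haar decomposition (even-length signal assumed),
    threshold the detail band, then reconstruct."""
    s = np.asarray(signal, dtype=float)
    approx = (s[0::2] + s[1::2]) / np.sqrt(2)
    detail = (s[0::2] - s[1::2]) / np.sqrt(2)
    detail = soft_threshold(detail, t)
    out = np.empty_like(s)
    out[0::2] = (approx + detail) / np.sqrt(2)
    out[1::2] = (approx - detail) / np.sqrt(2)
    return out

noisy = np.array([1.0, 1.2, 1.1, 0.9, 5.0, 5.1, 4.8, 5.2])
clean = haar_denoise_1d(noisy, t=0.2)  # small fluctuations smoothed out
```

Because each sample moves by at most t/sqrt(2), the smoothing is gentle; edges (here the jump from ~1 to ~5) survive because they live in the approximation band.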
330

Hydraulic Tomography: Field and Laboratory Experiments

Berg, Steven January 2011 (has links)
Accurately characterizing the distribution of hydraulic parameters is critical for any site investigation, particularly those dealing with solute or contaminant transport. Despite the many tools currently available for both characterizing (e.g., soil core analysis, slug and pumping tests, direct push techniques) and modeling (e.g., geostatistical interpolators, construction of geological models) heterogeneous aquifers, this remains a challenge. In this thesis, hydraulic tomography (HT), a recently developed tool for characterizing and modeling heterogeneous aquifers, is evaluated under both laboratory and field conditions. To date, both steady state hydraulic tomography (SSHT) and transient hydraulic tomography (THT) have been demonstrated at the laboratory scale; however, only SSHT has been rigorously validated through the prediction of independent tests (those not used for estimating the distribution of hydraulic parameters) and comparison to other characterization/modeling techniques. Additionally, laboratory and field validations of HT using comparisons other than the prediction of independent pumping tests (e.g., prediction of solute transport) are lacking. The laboratory studies performed in this thesis address some of these gaps by: i) rigorously validating THT through the prediction of independent pumping tests and comparison to other characterization techniques; ii) using HT-estimated parameter distributions to predict the migration of a conservative tracer in a heterogeneous sandbox aquifer; and iii) predicting the flow of water to a well in a heterogeneous, unconfined sandbox aquifer. In all three cases, HT was compared to more traditional characterization/modeling approaches, such as the calculation of homogeneous effective parameters, kriging of point data, or the creation and calibration of a geological model. In each study, the performance of HT was superior to the other characterization methods.
These laboratory experiments demonstrated both the ability of HT to map aquifer heterogeneity and the critical need to accurately understand heterogeneity in order to make accurate predictions about a system. In this regard, HT is a powerful tool at the laboratory scale, where the forcing functions (i.e., boundary conditions, flow rates, etc.) are accurately known. While several field scale HT studies have been reported in the literature, none attempt to validate 3D THT through the prediction of independent pumping tests or through comparison to known geology. The application of THT at the field scale presents unique challenges not faced in the laboratory setting. For example, boundary conditions are not accurately known, and it is not possible to instrument a field site as densely as a sandbox aquifer. In the field studies conducted as part of this thesis, THT was validated by comparing estimated hydraulic parameter fields to known geology (borehole data) and by simulating nine pumping tests performed at the site. The THT analysis was able to capture the salient features of the aquifer (two aquifers separated by an aquitard) and reasonably reproduce most of the pumping tests. For comparison purposes, a homogeneous model and three additional heterogeneous models were created: i) permeameter estimates of hydraulic conductivity from soil cores were interpolated via kriging; ii) the transition probability/Markov chain approach was used to interpret material classifications from borehole logs; and iii) a stratigraphic model was created and calibrated to pumping test data. Of these cases, THT and the calibrated stratigraphic model performed best, with THT performing slightly better. This work indicates that it is possible to interpret multiple pumping tests using hydraulic tomography to estimate the 3D distribution of hydraulic parameters in heterogeneous aquifer systems.
Also, since hydraulic tomography does not require the collection and analysis of a large number of point samples, it is likely comparable in cost to other characterization/modeling approaches.
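Hydraulic tomography works by repeatedly evaluating a groundwater flow forward model while adjusting hydraulic conductivity until simulated heads match the observed ones. A minimal 1D steady-state forward model (a sketch, far simpler than the 3D transient models used in the thesis) shows how heads respond to heterogeneous conductivity:

```python
import numpy as np

def steady_heads_1d(K_edges, h_left, h_right):
    """Heads at the n interior nodes of a 1D steady flow problem.
    K_edges holds conductivity on the n+1 edges linking n+2 equally
    spaced nodes; the two end nodes are held at fixed heads.
    Mass balance at node i: K[i](h[i-1]-h[i]) = K[i+1](h[i]-h[i+1])."""
    K = np.asarray(K_edges, dtype=float)
    n = len(K) - 1
    A = np.zeros((n, n))
    b = np.zeros(n)
    for i in range(n):
        A[i, i] = K[i] + K[i + 1]
        if i > 0:
            A[i, i - 1] = -K[i]
        if i < n - 1:
            A[i, i + 1] = -K[i + 1]
    b[0] += K[0] * h_left
    b[-1] += K[-1] * h_right
    return np.linalg.solve(A, b)

# Uniform conductivity gives a linear head profile...
h_uniform = steady_heads_1d(np.ones(4), 10.0, 0.0)  # [7.5, 5.0, 2.5]
# ...while a low-K edge concentrates the head drop across itself
h_hetero = steady_heads_1d([1.0, 0.1, 1.0, 1.0], 10.0, 0.0)
```

An inversion wraps a model like this in an optimization loop over K, which is why jointly interpreting many pumping tests can resolve heterogeneity that single-test analyses smear out.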
