21

Fast computation of the backprojection operator with applications in tomographic image reconstruction

Lima, Camila de 09 June 2017 (has links)
Incremental methods belong to a class of iterative methods that divide the data set into ordered subsets and update the image after processing each subset (sub-iterations). This accelerates the convergence of the reconstructions, and quality images are obtained in fewer iterations. However, the projection and backprojection operators must be computed in each sub-iteration, resulting in a computational cost of O(n³) flops for n × n images.
On the other hand, alternatives based on interpolation over a regular grid in Fourier space or on nonequispaced fast transforms, among other ideas, have been developed to alleviate this computational cost. In addition, several approaches substantially speed up the computation of the iterations of classical algorithms, but incremental methods had not yet benefited from these techniques. In this work, a new approach is proposed in which the nonequispaced fast Fourier transform (NFFT) is used in each sub-iteration of incremental methods to perform the numerically most intensive calculations, the projection and the backprojection, efficiently, resulting in incremental methods with complexity O(n² log n). The proposed methods are applied to synchrotron radiation tomography and the results show good performance.
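The ordered-subsets (incremental) iteration described in the abstract can be sketched in a few lines. This is an illustrative scheme, not the thesis's algorithm: the dense matrix `A` stands in for the projection operator (which the thesis evaluates via the NFFT precisely to avoid forming such a matrix), and the SIRT-like relaxation factor is a simplifying assumption.

```python
import numpy as np

def os_sirt(A, b, n_subsets=4, n_iters=200):
    """Ordered-subsets reconstruction sketch.

    A : (m, n) projection matrix (a dense stand-in for the operator
        the thesis applies via the NFFT in O(n^2 log n) time).
    b : (m,) measured sinogram data.
    """
    m, n = A.shape
    x = np.zeros(n)
    # Partition the projection data (rows of A) into ordered subsets.
    subsets = [np.arange(s, m, n_subsets) for s in range(n_subsets)]
    for _ in range(n_iters):
        for idx in subsets:                # one sub-iteration per subset
            As, bs = A[idx], b[idx]
            r = bs - As @ x                # forward projection + residual
            # Backprojection step; the step size 1/sigma_max^2 is a
            # simple safe choice, not the thesis's relaxation rule.
            x = x + As.T @ r / (np.linalg.norm(As, ord=2) ** 2)
    return x
```

Note that the image is updated after every subset, which is what gives incremental methods their faster practical convergence compared to updating once per full sweep.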
23

Embedded wavelet image reconstruction in parallel computation hardware

Guevara Escobedo, Jorge January 2016 (has links)
In this thesis an algorithm is demonstrated for the reconstruction of hard-field tomography images through localized block areas, obtained in parallel within a multiresolution framework. Block areas are subsequently tiled to assemble the full-size image. Given its ability to preserve compact support after ramp filtering, the wavelet transform has received much attention as a promising route to radiation dose reduction in medical imaging, through the reconstruction of essentially localized regions. In this work, this characteristic is exploited with the aim of reducing the time and complexity of the standard reconstruction algorithm. Independently reconstructing block images, with a geometry that completely covers the reconstructed frame as a single output image, allows the individual blocks to be reconstructed in parallel and their performance to be evaluated on reconfigurable multiprocessor hardware (i.e. an FPGA). Projection data from a simulated Radon transform (RT) were obtained at 180 evenly spaced angles. To define every relevant block area within the sinogram, the forward RT was performed over template phantoms representing block frames. Reconstruction was then performed in a domain beyond the block frame limits, to allow calibration overlaps when fitting adjacent block images. The 256 by 256 Shepp-Logan phantom was used to test the methodology of both the parallel multiresolution and the parallel block reconstruction generalisations. It is shown that the reconstruction of a single block image in a 3-scale multiresolution framework performs around 48 times faster than the standard methodology. Assuming a parallel implementation, it can be inferred that the reconstruction time of the full-size, full-resolution image should be very close to the reconstruction time of a single tile.
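The tile-and-assemble structure described above can be sketched with a thread pool. The per-block kernel below is a hypothetical placeholder for the wavelet-based localized reconstruction, and the non-overlapping tiling ignores the calibration overlaps mentioned in the abstract.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def reconstruct_block(item):
    """Stand-in for the per-block localized reconstruction (the
    thesis runs a wavelet-based localized FBP per block; this
    placeholder kernel just scales its input)."""
    block_id, data = item
    return block_id, data * 2.0

def tiled_reconstruction(block_inputs, grid_shape, block_shape):
    """Reconstruct all blocks in parallel, then tile them into the
    single full-size output image.

    block_inputs : dict mapping (row, col) grid position to the
                   block's input data (e.g. its sub-sinogram).
    """
    rows, cols = grid_shape
    h, w = block_shape
    image = np.zeros((rows * h, cols * w))
    with ThreadPoolExecutor() as pool:
        # Each block is independent, so the map can run concurrently;
        # on an FPGA each block would map to its own processing element.
        for (r, c), block in pool.map(reconstruct_block,
                                      block_inputs.items()):
            image[r * h:(r + 1) * h, c * w:(c + 1) * w] = block
    return image
```

Because no block depends on another's output, the wall-clock time of the whole frame approaches the time of one tile when enough processing elements are available, which is the speed-up argument made above.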
24

Multi-Aperture Coherent Change Detection and Interferometry for Synthetic Aperture Radar

Madsen, David D. 09 March 2010 (has links) (PDF)
Interferometry and coherent change detection (CCD) utilize phase differences between complex SAR images to find terrain height and to detect small changes between images, respectively. A new method for improving interferometry and CCD using multiple sub-apertures is proposed. Using backprojection processing, multiple sub-aperture images are created for a pair of flights. An interferogram and coherence map is made from each sub-aperture. For CCD, each sub-aperture coherence map offers an independent estimate of the coherence over the same area. By combining coherence maps, low coherence areas associated with residual motion errors are reduced, shadowed areas are minimized, and the overall coherence of stationary objects between images is increased. For interferometry, combining independent estimates of a scene's height offers a more accurate height estimate. For repeat-pass interferometry, multiple apertures are shown to increase the coverage of valid height estimates. The benefits of multi-aperture interferometry and CCD are shown using examples with real data.
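The per-sub-aperture coherence estimate and a simple combination rule can be sketched as follows. The boxcar window estimator and the plain averaging of sub-aperture maps are illustrative assumptions, not necessarily the thesis's exact choices.

```python
import numpy as np

def coherence_map(s1, s2, win=3):
    """Sample coherence of two co-registered complex SAR images,
    estimated over a win x win sliding window (interior pixels only).
    gamma = |sum(s1 * conj(s2))| / sqrt(sum|s1|^2 * sum|s2|^2)."""
    h, w = s1.shape
    r = win // 2
    gamma = np.zeros((h, w))
    for i in range(r, h - r):
        for j in range(r, w - r):
            a = s1[i - r:i + r + 1, j - r:j + r + 1]
            b = s2[i - r:i + r + 1, j - r:j + r + 1]
            num = np.abs(np.sum(a * np.conj(b)))
            den = np.sqrt(np.sum(np.abs(a) ** 2) * np.sum(np.abs(b) ** 2))
            gamma[i, j] = num / (den + 1e-12)
    return gamma

def combine_subapertures(gammas):
    """Combine independent per-sub-aperture coherence estimates of the
    same area; a simple average is one plausible combination rule."""
    return np.mean(gammas, axis=0)
```

Each sub-aperture pair yields an independent coherence estimate, so combining them suppresses low-coherence artifacts (e.g. residual motion errors) that appear in only some sub-apertures, which is the mechanism the abstract describes.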
25

Time Domain SAR Processing with GPUs for Airborne Platforms

Lagoy, Dustin 24 March 2017 (has links)
A time-domain backprojection processor for airborne synthetic aperture radar (SAR) has been developed at the University of Massachusetts' Microwave Remote Sensing Lab (MIRSL). The aim of this work is to produce a SAR processor capable of addressing the motion compensation issues faced by frequency-domain processing algorithms, in order to create well-focused SAR imagery suitable for interferometry. The time-domain backprojection algorithm inherently compensates for non-linear platform motion, dependent on the availability of accurate measurements of the motion. The implementation must manage the relatively high computational burden of the backprojection algorithm, which is done using modern graphics processing units (GPUs) programmed with NVIDIA's CUDA language. An implementation of the non-equispaced fast Fourier transform (NFFT) is used to enable efficient and accurate range interpolation as a critical step of the processing. The phase of time-domain processed imagery is different from that of frequency-domain imagery, leading to a potentially different approach to interferometry. This general-purpose SAR processor is designed to work with a novel dual-frequency S- and Ka-band radar system developed at MIRSL as well as the UAVSAR instrument developed by NASA's Jet Propulsion Laboratory. These instruments represent a wide range of SAR system parameters, ensuring the ability of the processor to work with almost any airborne SAR. Results are presented from these two systems, showing good performance of the processor itself.
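The core of time-domain backprojection can be sketched as a naive CPU loop over pulses, which is the operation a GPU version parallelizes across threads. The signature and the use of `np.interp` for range interpolation are illustrative assumptions; the processor described here uses an NFFT-based interpolation instead.

```python
import numpy as np

def backproject(rc_data, platform_pos, range_axis, pixels, wavelength):
    """Naive time-domain SAR backprojection (CPU sketch).

    rc_data      : (n_pulses, n_bins) range-compressed complex data
    platform_pos : (n_pulses, 3) antenna position per pulse
    range_axis   : (n_bins,) range of each fast-time bin in metres
    pixels       : (n_pix, 3) image pixel coordinates
    """
    image = np.zeros(len(pixels), dtype=complex)
    for p in range(rc_data.shape[0]):
        # Exact pulse-to-pixel range: measured (possibly non-linear)
        # platform motion enters here, which is why backprojection
        # compensates for it inherently.
        R = np.linalg.norm(pixels - platform_pos[p], axis=1)
        # Range interpolation (linear here; the real processor uses
        # an NFFT-based interpolator for accuracy and speed).
        samp = np.interp(R, range_axis, rc_data[p].real) \
             + 1j * np.interp(R, range_axis, rc_data[p].imag)
        # Remove the two-way propagation phase and accumulate.
        image += samp * np.exp(4j * np.pi * R / wavelength)
    return image
```

On a GPU, the loop over pixels (the vectorized axis above) maps naturally onto one thread per pixel, with the pulse loop either kept serial per thread or reduced across blocks.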
26

Image Reconstruction Based On Hilbert And Hybrid Filtered Algorithms With Inverse Distance Weight And No Backprojection Weight

Narasimhadhan, A V 08 1900 (has links) (PDF)
Filtered backprojection (FBP) reconstruction algorithms are very popular in the field of X-ray computed tomography (CT) because they offer advantages in numerical accuracy and computational complexity. Ramp-filter based fan-beam FBP reconstruction algorithms have a position-dependent weight in the backprojection, which is responsible for a spatially non-uniform distribution of noise and resolution, and for artifacts. Many algorithms based on shift-variant filtering or spatially-invariant interpolation in the backprojection step have been developed to deal with this issue. However, these algorithms are computationally demanding. Recently, fan-beam algorithms based on Hilbert filtering with inverse distance weight and with no weight in the backprojection have been derived using Hamaker's relation. These fan-beam reconstruction algorithms have been shown to improve uniformity of noise and resolution. In this thesis, fan-beam FBP reconstruction algorithms with inverse distance backprojection weight and with no backprojection weight for 2D image reconstruction are presented and discussed for the two fan-beam scan geometries: equi-angular and equi-spaced detector arrays. Based on these fan-beam reconstruction algorithms, new 3D cone-beam FDK reconstruction algorithms with circular and helical scan trajectories for curved and planar detector geometries are proposed. To start with, three rebinning formulae from the literature are presented, and it is shown that all fan-beam FBP reconstruction algorithms can be derived from them. Specifically, two fan-beam algorithms with no backprojection weight based on Hilbert filtering for an equi-spaced linear detector array, and one new fan-beam algorithm with inverse distance backprojection weight based on hybrid filtering for both equi-angular and equi-spaced linear detector arrays, are derived.
Simulation results for these algorithms, in terms of uniformity of noise and resolution, are presented in comparison to the standard ramp-filter based fan-beam FBP reconstruction algorithm. It is shown through simulation that the fan-beam reconstruction algorithm with inverse distance in the backprojection gives better noise performance while retaining the resolution properties. A comparison between the above-mentioned reconstruction algorithms is given in terms of computational complexity. State-of-the-art 3D X-ray imaging systems in medicine with cone-beam (CB) circular and helical computed tomography scanners use non-exact (approximate) FBP based reconstruction algorithms. They are attractive because of their simplicity and low computational cost. However, they produce sub-optimal reconstructed images with respect to cone-beam artifacts and noise, and, in the case of circular trajectory scan imaging, axial intensity drop. The axial intensity drop in the reconstructed image is due to the insufficient data acquired by the circular-scan trajectory CB CT. This thesis investigates improving the image quality by means of Hilbert and hybrid filtering based algorithms using redundant data for Feldkamp, Davis and Kress (FDK) type reconstruction algorithms. New FDK type reconstruction algorithms for cylindrical and planar detectors for CB circular CT are developed, obtained by extending to three dimensions (3D) an exact Hilbert filtering based 2D fan-beam FBP algorithm with no position-dependent backprojection weight and a fan-beam algorithm with inverse distance backprojection weight. The proposed FDK reconstruction algorithm with inverse distance weight in the backprojection requires full-scan projection data, while the FDK reconstruction algorithm with no backprojection weight can handle partial-scan data, including very short scans.
The FDK reconstruction algorithms with no backprojection weight for circular CB CT are compared with Hu's, FDK and T-FDK reconstruction algorithms in terms of axial intensity drop and computational complexity. Simulation results on noise, CB artifact performance and execution timing, as well as the partial-scan reconstruction abilities, are presented. We show that FDK reconstruction algorithms with no backprojection weight have better noise characteristics than the conventional FDK reconstruction algorithm, whose backprojection weight is known to result in spatial non-uniformity of the noise. This thesis also presents an efficient method to reduce the axial intensity drop in circular CB CT. The method consists of two steps: first, reconstruction of the object using the FDK reconstruction algorithm with no backprojection weight, and second, estimation of the missing term. The method is comparable to Zhu et al.'s method in terms of reduction in axial intensity drop, noise and computational complexity. The helical scanning trajectory satisfies the Tuy-Smith condition, hence an exact and stable reconstruction is possible. However, the helical FDK reconstruction algorithm produces cone-beam artifacts since its derivation is approximate. In this thesis, helical FDK reconstruction algorithms based on Hilbert filtering with no backprojection weight, and an FDK reconstruction algorithm based on hybrid filtering with inverse distance backprojection weight, are presented to reduce the CB artifacts. These algorithms are compared with the standard helical FDK in terms of noise, CB artifacts and computational complexity.
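For context, the "standard" baseline the thesis compares against can be sketched in its simplest geometry. This is a plain parallel-beam ramp-filter FBP, not the fan-beam Hilbert/hybrid algorithms developed in the thesis; the function names and the normalization constant are illustrative.

```python
import numpy as np

def ramp_filter(sino):
    """Apply the ramp filter |omega| along the detector axis via FFT."""
    n = sino.shape[1]
    freqs = np.abs(np.fft.fftfreq(n))
    return np.real(np.fft.ifft(np.fft.fft(sino, axis=1) * freqs, axis=1))

def fbp_parallel(sino, angles_deg, size):
    """Parallel-beam filtered backprojection: filter each projection,
    then smear it back across the image along its view direction."""
    filt = ramp_filter(sino)
    xs = np.arange(size) - (size - 1) / 2.0
    X, Y = np.meshgrid(xs, xs)
    centre = (sino.shape[1] - 1) / 2.0
    img = np.zeros((size, size))
    for proj, ang in zip(filt, np.deg2rad(angles_deg)):
        # Detector coordinate of each pixel for this view (no
        # position-dependent weight here: parallel geometry).
        t = X * np.cos(ang) + Y * np.sin(ang) + centre
        img += np.interp(t, np.arange(sino.shape[1]), proj)
    return img * np.pi / len(angles_deg)
```

In the fan-beam case the backprojection step acquires the position-dependent weight discussed above; the algorithms in this thesis remove that weight (or replace it with an inverse distance weight) by changing the filtering step from ramp to Hilbert/hybrid filters.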
27

Performance Evaluation Of Fan-beam And Cone-beam Reconstruction Algorithms With No Backprojection Weight On Truncated Data Problems

Sumith, K 07 1900 (has links) (PDF)
This work focuses on using linear prediction based projection completion with the fan-beam and cone-beam reconstruction algorithms with no backprojection weight. Truncated data problems have long been addressed in computed tomography research; however, perfect image reconstruction from truncated data has not yet been achieved, and only approximately accurate solutions have been obtained, so research in this area continues to strive for results close to the perfect. Linear prediction techniques are adopted for truncation completion in this work because previous research on truncated data problems has shown that they work well compared to other techniques such as polynomial fitting and iterative methods. Linear prediction is a model based technique. The autoregressive (AR) and moving average (MA) models are the two important models, along with the autoregressive moving average (ARMA) model. The AR model is used in this work because of the simplicity it provides in calculating the prediction coefficients. The order of the model is chosen based on the partial autocorrelation function of the projection data, as established in previous research in this area. Truncated projection completion using linear prediction and windowed linear prediction shows that reasonably accurate reconstruction is achieved. Windowed linear prediction provides a better estimate of the missing data; the reason for this is given in the literature and is restated for the reader's convenience in this work. The advantages of fan-beam reconstruction algorithms with no backprojection weight over those with backprojection weight motivated us to use the former for reconstructing the truncation-completed projection data.
The results obtained are compared with previous work that used conventional fan-beam reconstruction algorithms with backprojection weight. The intensity plots and noise performance results show improvements resulting from using the fan-beam reconstruction algorithm with no backprojection weight. The work is also extended to the Feldkamp, Davis, and Kress (FDK) reconstruction algorithm with no backprojection weight for the helical scanning geometry, and the results are compared with the FDK reconstruction algorithm with backprojection weight for the same geometry.
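The AR-based completion of a truncated projection row can be sketched as follows. The least-squares coefficient fit and the fixed model order are illustrative simplifications; the thesis selects the order from the partial autocorrelation function of the projection data.

```python
import numpy as np

def fit_ar(x, order):
    """Least-squares fit of AR(p) prediction coefficients so that
    x[t] ~ a[0]*x[t-1] + a[1]*x[t-2] + ... + a[p-1]*x[t-p]."""
    rows = [x[i:i + order][::-1] for i in range(len(x) - order)]
    A = np.array(rows)
    y = x[order:]
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coeffs

def extrapolate(x, order, n_extra):
    """Extend a truncated projection row by AR linear prediction:
    fit the model on the measured part, then predict the missing
    detector samples one at a time."""
    a = fit_ar(np.asarray(x, float), order)
    out = list(x)
    for _ in range(n_extra):
        # Most recent sample first, matching the fit ordering.
        out.append(float(np.dot(a, out[-1:-order - 1:-1])))
    return np.array(out)
```

Windowed linear prediction, as used in the thesis, would additionally taper the fitted segment before prediction; the sketch above is the unwindowed variant.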
28

Proton computed tomography

Quiñones, Catherine Thérèse 28 September 2016 (has links)
The use of protons in cancer treatment is widely recognized thanks to the precise stopping range of protons in matter. In proton therapy treatment planning, the uncertainty in determining the range mainly stems from the inaccuracy in converting the Hounsfield units obtained from x-ray computed tomography to proton stopping power. Proton CT (pCT) is an attractive solution because this modality directly reconstructs the relative stopping power (RSP) map of the object. The conventional pCT technique is based on measurements of the energy loss of protons to reconstruct the RSP map of the object. In addition to energy loss, protons also undergo multiple Coulomb scattering and nuclear interactions, which can reveal other interesting properties of the materials not visible in the RSP maps. This PhD work investigates proton interactions through Monte Carlo simulations in GATE and uses this information to reconstruct a map of the object through filtered backprojection along the most likely proton paths. Aside from the conventional energy-loss pCT, two pCT modalities have been investigated and implemented.
The first, called attenuation pCT, uses the attenuation of protons to reconstruct the linear inelastic nuclear cross-section map of the object. The second, called scattering pCT, exploits proton scattering by measuring the angular variance to reconstruct the relative scattering power map, which is related to the radiation length of the material. The accuracy, precision and spatial resolution of the images reconstructed from the two pCT modalities were evaluated qualitatively and quantitatively and compared with conventional energy-loss pCT. While energy-loss pCT already provides the information needed to calculate the proton range for treatment planning, attenuation pCT and scattering pCT give complementary information about the object. First, scattering pCT and attenuation pCT images provide additional information intrinsic to the materials in the object. Second, in some of the studied cases, attenuation pCT images demonstrate better spatial resolution and show features that would supplement energy-loss pCT reconstructions.
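The attenuation pCT measurement model can be illustrated with the x-ray-like log transform implied above: the fraction of protons surviving nuclear interactions along a path gives the line integral of the linear inelastic nuclear cross-section. This is a hedged sketch of that analogy; the function names and the homogeneous-slab check are illustrative, not the thesis's implementation.

```python
import numpy as np

def attenuation_projection(n_in, n_out):
    """Attenuation pCT projection value: with N_in protons entering a
    path and N_out surviving nuclear interactions,
    p = -ln(N_out / N_in) is the line integral of the linear
    inelastic nuclear cross-section along that path."""
    return -np.log(np.asarray(n_out, float) / np.asarray(n_in, float))

def expected_counts(n_in, kappa, length):
    """Surviving protons after a homogeneous slab of linear inelastic
    nuclear cross-section kappa (1/cm) and thickness length (cm),
    under the exponential survival model: N_out = N_in * exp(-kappa*L)."""
    return n_in * np.exp(-kappa * length)
```

Collecting such projection values over many paths and angles yields a sinogram of the nuclear cross-section, which can then be reconstructed by filtered backprojection along the most likely proton paths, just as described for the other pCT modalities.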
