31 |
Enhanced SAR Image Processing Using A Heterogeneous Multiprocessor
SHI, YU January 2008 (has links)
Synthetic Aperture Radar (SAR) is a pulse-focusing airborne radar technique that can produce high-resolution radar images. A number of image-formation algorithms have been developed for this kind of radar, but the computational burden is still heavy, so SAR image processing is normally performed "off-line". The Fast Factorized Back Projection (FFBP) algorithm is considered a computationally efficient algorithm for SAR image formation, and several implementations have tried to make the processing "on-line". The Cell Broadband Engine, jointly developed by Sony, Toshiba and IBM, is one of the newest multi-core processors; it is well suited to parallel computation and floating-point arithmetic, which fits the demands of SAR image formation. This thesis implements the FFBP algorithm on the Cell Broadband Engine and compares the results with previous projects. The aim of the project is to make real-time SAR image formation possible.
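For orientation, the sketch below outlines the direct time-domain back-projection that FFBP factorizes and accelerates. It is a minimal NumPy illustration only; the geometry conventions, interpolation scheme and variable names are assumptions, not the Cell Broadband Engine implementation described in the thesis.

```python
import numpy as np

def sar_backprojection(range_profiles, platform_pos, grid_x, grid_y,
                       wavelength, fs, t0=0.0, c=3e8):
    """Direct time-domain back-projection (the baseline that FFBP factorizes).

    range_profiles : (n_pulses, n_bins) complex range-compressed echoes
    platform_pos   : (n_pulses, 3) antenna position for each pulse [m]
    grid_x, grid_y : 1-D coordinates of the output image grid (z = 0 plane)
    wavelength     : carrier wavelength [m]; fs : fast-time sampling rate [Hz]
    """
    n_pulses, n_bins = range_profiles.shape
    X, Y = np.meshgrid(grid_x, grid_y)
    image = np.zeros(X.shape, dtype=complex)
    bins = np.arange(n_bins)
    for p in range(n_pulses):
        dx = X - platform_pos[p, 0]
        dy = Y - platform_pos[p, 1]
        dz = 0.0 - platform_pos[p, 2]
        r = np.sqrt(dx * dx + dy * dy + dz * dz)      # slant range to every pixel
        idx = (2.0 * r / c - t0) * fs                 # fractional fast-time bin
        sample = np.interp(idx.ravel(), bins, range_profiles[p],
                           left=0, right=0).reshape(X.shape)
        image += sample * np.exp(1j * 4.0 * np.pi * r / wavelength)  # phase compensation
    return image
```

Each pixel is visited once per pulse, which is what makes direct back-projection expensive; FFBP reduces this cost by recursively merging sub-aperture images on coarse polar grids.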
32 |
Reconstruction de la dose absorbée in vivo en 3D pour les traitements RCMI et arcthérapie à l'aide des images EPID de transit / 3D in vivo absorbed dose reconstruction for IMRT and arc therapy treatments with EPID transit images
Younan, Fouad 13 December 2018 (has links)
This thesis addresses the dosimetry of the high-energy photon beams delivered to the patient during an external radiotherapy treatment. The objective of this work is to use the Electronic Portal Imaging Device (EPID) to verify that the 3D absorbed dose distribution in the patient is consistent with the calculation performed on the Treatment Planning System (TPS). Acquisition is carried out in continuous mode with the aS-1200 amorphous silicon detector embedded on the TrueBeam STx machine (Varian Medical Systems, Palo Alto, USA), with 10 MV photon beams at a dose rate of 600 MU/min. The source-to-detector distance (SDD) is 150 cm. After correction of defective pixels, a calibration step converts the signal into absorbed dose in water via a response function. Correction kernels are also used to take into account the difference in materials between the EPID and water and to correct the penumbra on the dose profiles.
A first back-projection model reconstructs the absorbed dose distribution in a homogeneous medium by taking several phenomena into account: the scattered photons travelling from the phantom to the EPID, the attenuation of the beams, scatter within the phantom, the build-up effect, and beam hardening with depth. The reconstructed dose is compared with the one calculated by the TPS using a global gamma analysis (3% of the dose maximum as the dose-difference criterion and 3 mm as the distance-to-agreement criterion). The algorithm was tested on a homogeneous cylindrical phantom and a pelvis phantom for Intensity-Modulated Radiation Therapy (IMRT) and Volumetric Modulated Arc Therapy (VMAT) techniques. The model was then refined to take the heterogeneities of the medium into account by using radiological (water-equivalent) distances, in a new dosimetric approach known as "in aqua vivo" (1). It was tested on a thorax phantom and, in vivo, on 10 patients treated for a prostate tumour with VMAT fields. Finally, the in aqua model was tested on the thorax phantom before and after introducing deliberate modifications, in order to evaluate its ability to detect errors that could affect correct delivery of the dose to the patient. [...]
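As a reference for the comparison metric, the following is a minimal brute-force sketch of a global 2D gamma evaluation with a 3%/3 mm criterion; the grid assumptions (same uniform pixel spacing for both dose maps, limited search radius) are illustrative, and this is not the clinical analysis software used in this work.

```python
import numpy as np

def gamma_index_2d(dose_ref, dose_eval, spacing_mm,
                   dose_crit=0.03, dta_mm=3.0, search_mm=9.0):
    """Brute-force global gamma (dose criterion as a fraction of the reference maximum)."""
    dd = dose_crit * dose_ref.max()              # global 3% criterion
    r = int(np.ceil(search_mm / spacing_mm))     # half-width of the local search window
    ny, nx = dose_ref.shape
    gamma = np.full_like(dose_ref, np.inf, dtype=float)
    for i in range(ny):
        for j in range(nx):
            best = np.inf
            for di in range(-r, r + 1):
                for dj in range(-r, r + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < ny and 0 <= jj < nx:
                        dist2 = (di ** 2 + dj ** 2) * spacing_mm ** 2
                        diff2 = (dose_eval[ii, jj] - dose_ref[i, j]) ** 2
                        best = min(best, dist2 / dta_mm ** 2 + diff2 / dd ** 2)
            gamma[i, j] = np.sqrt(best)
    return gamma   # pass rate = np.mean(gamma <= 1)
```

A point passes when its gamma value is at most 1, i.e. a nearby evaluated dose exists that satisfies the combined dose-difference and distance-to-agreement tolerance.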
33 |
Digitální metody zpracování trojrozměrného zobrazení v rentgenové tomografii a holografické mikroskopii / The Three-Dimensional Digital Imaging Methods for X-ray Computed Tomography and Digital Holographic Microscopy
Kvasnica, Lukáš January 2015 (has links)
This dissertation deals with methods for processing image data in X-ray microtomography and digital holographic microscopy. The work aims to achieve a significant acceleration of the algorithms for tomographic reconstruction and for image reconstruction in holographic microscopy by means of optimization and the use of massively parallel GPUs. In the field of microtomography, new GPU (graphics processing unit) accelerated implementations of filtered back projection and of back-projection filtration of derived data are presented, together with a technique for orientation normalization and evaluation of 3D tomographic data. In the part devoted to holographic microscopy, the individual steps of the complete image-processing procedure are described. This part introduces a new, original technique for phase unwrapping and for correcting image phase damaged by optical vortices in the wrapped phase, followed by methods for compensating the phase deformation and for tracking cells. In conclusion, the Q-PHASE software is briefly introduced: a complete bundle of all the algorithms necessary for holographic microscope control and holographic image processing.
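To make the filter-then-backproject structure that the GPU implementation parallelizes concrete, here is a minimal CPU sketch of parallel-beam filtered back projection with a Ram-Lak filter; the variable names, the simple linear interpolation and the normalization are illustrative assumptions, not the dissertation's optimized GPU code.

```python
import numpy as np

def fbp_parallel(sinogram, angles_deg):
    """Parallel-beam filtered back projection.

    sinogram   : (n_angles, n_det) line integrals
    angles_deg : projection angles in degrees
    Returns an (n_det, n_det) reconstruction.
    """
    n_angles, n_det = sinogram.shape
    # Ramp (Ram-Lak) filtering in the Fourier domain, one projection per row.
    ramp = np.abs(np.fft.fftfreq(n_det))
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))
    # Back projection onto a centred square grid.
    mid = (n_det - 1) / 2.0
    xs = np.arange(n_det) - mid
    X, Y = np.meshgrid(xs, xs)
    recon = np.zeros((n_det, n_det))
    for k, theta in enumerate(np.deg2rad(angles_deg)):
        t = X * np.cos(theta) + Y * np.sin(theta) + mid   # detector coordinate per pixel
        recon += np.interp(t.ravel(), np.arange(n_det), filtered[k],
                           left=0, right=0).reshape(n_det, n_det)
    return recon * np.pi / (2 * n_angles)
```

The per-angle accumulation loop is independent across pixels and projections, which is exactly what a GPU port maps onto thousands of threads.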
34 |
Physics-Based Near-Field Microwave Imaging Algorithms for Dense Layered Media
Ren, Kai January 2017 (has links)
No description available.
35 |
Algoritmo de reconstrucción analítico para el escáner basado en cristales monolíticos MINDView
Sánchez Góez, Sebastián 17 January 2021 (has links)
Positron Emission Tomography (PET) is a medical imaging technique in which an image is generated from the detection of gamma rays in coincidence. These rays are produced within a patient who is injected with a positron-emitting radiotracer, whose positrons annihilate with electrons in the surrounding medium. The acquisition process is centred on the scanner detector, which in turn contains a scintillation crystal that transforms the incident gamma rays into optical photons within the crystal. The purpose is then to determine the impact coordinates within the scintillation crystal with the greatest possible precision so that, from these points, an image can be reconstructed.
Historically, detectors based on pixelated crystals have been the standard choice for manufacturing PET scanners. This thesis evaluates the impact on the spatial resolution of the MINDView PET scanner, developed within the Seventh Framework Programme of the European Union (No. 603002), whose detectors are based on monolithic crystals. The use of monolithic crystals facilitates the determination of the depth of interaction (DOI) of the incident gamma rays, increases the precision of the estimated impact coordinates, and reduces the parallax error induced in pixelated crystals by the difficulty of determining the DOI.
In this thesis, we achieved two main goals related to the measurement of the spatial resolution of the MINDView PET scanner: the adaptation of the STIR Filtered Back Projection 3D Reprojected (FBP3DRP) algorithm to a scanner based on monolithic crystals, and the implementation of a BackProjection-then-Filtered (BPF) algorithm. With the adapted FBP3DRP algorithm, the spatial resolutions obtained range over the intervals [2 mm, 3.4 mm], [2.3 mm, 3.3 mm] and [2.2 mm, 2.3 mm] in the radial, tangential and axial directions, respectively, for the first brain-dedicated MINDView prototype. For the BPF algorithm, an acquisition of a Derenzo phantom was performed and the resolution obtained was compared against the FBP3DRP algorithm and an implementation of the list-mode ordered-subsets algorithm (LMOS). With the BPF-type algorithm, peak-to-valley values of 2.4 were obtained along the 1.6 mm diameter rods of the phantom, in contrast to 1.34 and 1.44 for the FBP3DRP and LMOS algorithms, respectively. This means that the BPF-type algorithm improves the resolution, reaching an average value of 1.6 mm.
Sánchez Góez, S. (2020). Algoritmo de reconstrucción analítico para el escáner basado en cristales monolíticos MINDView [Tesis doctoral]. Universitat Politècnica de València. https://doi.org/10.4995/Thesis/10251/159259
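The peak-to-valley figures quoted above come from intensity profiles drawn across the Derenzo rods. A minimal sketch of how such a ratio can be computed from a 1-D profile is given below, assuming peaks and valleys are taken as local extrema of a reasonably smooth profile; the thesis's exact analysis may differ.

```python
import numpy as np

def peak_to_valley(profile):
    """Mean peak-to-valley ratio of a 1-D intensity profile across phantom rods.

    Peaks are local maxima, valleys the minima between consecutive peaks.
    """
    p = np.asarray(profile, dtype=float)
    peaks = [i for i in range(1, len(p) - 1) if p[i] >= p[i - 1] and p[i] > p[i + 1]]
    if len(peaks) < 2:
        raise ValueError("need at least two peaks to define a valley")
    ratios = []
    for a, b in zip(peaks[:-1], peaks[1:]):
        valley = p[a:b + 1].min()
        if valley > 0:
            ratios.append(0.5 * (p[a] + p[b]) / valley)   # average of the two peaks over the valley
    return float(np.mean(ratios))
```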
36 |
Super resolução baseada em métodos iterativos de restauração
Castro, Márcia Luciana Aguena 24 June 2013 (has links)
The resolution enhancement of an image is always desirable, whatever its purpose, but especially when the image is intended for visual analysis. Hardware development for increasing image resolution at capture time still costs more than algorithmic super-resolution (SR) solutions. Like image restoration, super-resolution is an ill-conditioned inverse problem with infinitely many solutions. This work analyzes iterative restoration methods (Van Cittert, Tikhonov-Miller and Conjugate Gradient) that propose solutions to the ill-conditioning problem and compares them with the IBP (Iterative Back-Projection) method. The analysis of the similarities found is the basis of a generalization in which other iterative restoration methods can have their properties adapted, such as regularization of the ill-conditioning, reduction of noise and other degradations, and an increased convergence rate, so that these properties can be incorporated into super-resolution techniques. Two new methods were created as case studies of the proposed generalization: the first is a super-resolution method for dynamic magnetic resonance imaging (MRI) of the swallowing process, which uses adaptive Wiener filtering as regularization together with non-rigid registration; the second is a pan-sharpening method for the SPOT satellite bands, which uses sampling based on the sensor's characteristics and non-adaptive Wiener filtering.
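For reference, the IBP scheme that serves as the point of comparison can be sketched in a few lines. The single-image version below assumes block averaging as the forward (degradation) model and pixel replication as the back-projection kernel, and omits the Wiener regularization and registration used in the proposed methods.

```python
import numpy as np

def ibp_superresolution(lr, scale=2, n_iter=50, lam=1.0):
    """Single-image Iterative Back-Projection.

    Imaging model: HR image -> (scale x scale) block average -> LR observation.
    Each iteration back-projects the LR residual into the HR estimate.
    """
    lr = np.asarray(lr, dtype=float)
    hr = np.kron(lr, np.ones((scale, scale)))              # initial HR guess (replication)
    for _ in range(n_iter):
        # Simulate the LR image from the current HR estimate (block averaging).
        sim = hr.reshape(lr.shape[0], scale, lr.shape[1], scale).mean(axis=(1, 3))
        err = lr - sim                                      # residual in LR space
        hr += lam * np.kron(err, np.ones((scale, scale)))   # back-project the residual
    return hr
```

The generalization discussed in the work amounts to replacing the plain residual back-projection step with the update rule of another iterative restoration method (Van Cittert, Tikhonov-Miller, Conjugate Gradient), inheriting its regularization and convergence properties.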
37 |
Automated Selection of Hyper-Parameters in Diffuse Optical Tomographic Image Reconstruction
Jayaprakash, * January 2013 (has links) (PDF)
Diffuse optical tomography is a promising imaging modality that provides functional information about soft biological tissues, with prime imaging applications including breast and brain tissue in vivo. This modality uses near-infrared light (600 nm to 900 nm) as the probing medium, giving it the advantage of being a non-ionizing imaging modality.
The image reconstruction problem in diffuse optical tomography is typically posed as a least-squares problem that minimizes the difference between experimental and modelled data with respect to the optical properties. This problem is non-linear and ill-posed, owing to multiple scattering of the near-infrared light in biological tissue, and therefore admits infinitely many possible solutions. Traditional methods employ a regularization term to constrain the solution space and stabilize the solution, Tikhonov-type regularization being the most popular. The choice of this regularization parameter, also known as the hyper-parameter, dictates the reconstructed optical image quality and is typically made empirically or based on prior experience.
In this thesis, a simple back-projection-type image reconstruction algorithm is first taken up, as such algorithms are known to provide computationally efficient solutions compared with regularized ones. In these algorithms the hyper-parameter becomes equivalent to a filter factor, whose choice typically depends on the sampling interval used for acquiring data in each projection and on the projection angle. Determining these parameters for diffuse optical tomography is not straightforward and requires advanced computational models. A computationally efficient simplex-method-based optimization scheme for automatically finding this filter factor is therefore proposed, and its performance is evaluated on numerical and experimental phantom data. As back-projection-type algorithms are approximations to the traditional methods, the absolute quantitative accuracy of the reconstructed optical properties is poor. In scenarios such as dynamic imaging, where the emphasis is on recovering the relative difference in optical properties, these algorithms are effective in comparison with traditional methods, with the added advantage of being highly computationally efficient.
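A minimal sketch of such a simplex (Nelder-Mead) search is shown below. The `reconstruct` and `forward_model` callables are placeholders for the problem-specific back-projection reconstruction and light-propagation model, so this illustrates the optimization loop rather than the exact scheme developed in the thesis.

```python
import numpy as np
from scipy.optimize import minimize

def choose_filter_factor(reconstruct, forward_model, measured, x0=0.1):
    """Nelder-Mead (simplex) search for the back-projection filter factor.

    reconstruct(filter_factor) -> optical-property image for that factor
    forward_model(image)       -> modelled boundary data for that image
    measured                   -> experimental boundary data
    All three are placeholders for the problem-specific pieces.
    """
    def misfit(params):
        ff = abs(params[0])                         # keep the factor positive
        modelled = forward_model(reconstruct(ff))
        return float(np.sum((modelled - measured) ** 2))

    res = minimize(misfit, x0=[x0], method="Nelder-Mead",
                   options={"xatol": 1e-4, "fatol": 1e-6})
    return abs(res.x[0])
```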
In the second part of this thesis, the hyper-parameter choice for traditional Tikhonov-type regularization is addressed with the help of the Least-Squares QR decomposition (LSQR) method. Established techniques that enable an automated choice of hyper-parameter include Generalized Cross-Validation (GCV) and the regularized Minimal Residual Method (MRM), both of which carry a high computational overhead that makes them prohibitive for real-time use. The proposed LSQR algorithm uses bidiagonalization of the system matrix to reduce the computational cost. The proposed LSQR-based algorithm for automated hyper-parameter choice is compared with MRM-based methods and shown to be the computationally optimal technique on numerical and experimental phantom cases.
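The core of an LSQR-based regularized solve can be sketched as follows, with SciPy's `lsqr` and its `damp` argument standing in for the Tikhonov hyper-parameter; the automated selection logic itself is not reproduced here, only the inexpensive bidiagonalization-based solve it relies on.

```python
import numpy as np
from scipy.sparse.linalg import lsqr

def tikhonov_update_lsqr(J, residual, lam):
    """Solve min ||J dx - residual||^2 + lam^2 ||dx||^2 via LSQR bidiagonalization.

    J        : (n_meas, n_params) sensitivity (Jacobian) matrix
    residual : (n_meas,) difference between measured and modelled data
    lam      : Tikhonov hyper-parameter (passed as LSQR's 'damp' argument)
    """
    result = lsqr(J, residual, damp=lam, atol=1e-8, btol=1e-8, iter_lim=200)
    return result[0]          # parameter update dx
```

Because the bidiagonalization of J is the dominant cost and can be reused while scanning candidate values of lam, an automated hyper-parameter search built on this solve stays far cheaper than repeated full regularized inversions.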
38 |
An Optimized Fixed-Point Synthetic Aperture Radar Back Projection Algorithm Implemented on a Field-Programmable Gate Array
Hettiarachchi, Don Lahiru Nirmal Manikka January 2021 (has links)
No description available.
39 |
Development of a novel sensor for soot deposition measurement in a diesel particulate filter using electrical capacitance tomography
Huq, Ragibul January 2014 (has links)
Indiana University-Purdue University Indianapolis (IUPUI)
This thesis presents a novel approach to particulate matter (soot) measurement in a Diesel particulate filter using Electrical Capacitance Tomography. Modern Diesel engines are equipped with Diesel Particulate Filters (DPFs) as well as on-board technologies to evaluate the status of the DPF, because complete knowledge of the DPF soot loading is critical for robust and efficient operation of the engine exhaust after-treatment system. Emission regulations on gaseous as well as particulate (soot) emissions are imposed by environmental regulatory agencies on all internal combustion engines, including Diesel engines. Over time, soot is deposited inside the DPF and tends to clog the filter, generating a back pressure in the exhaust system that negatively impacts fuel efficiency. To remove the soot build-up, the DPF must be regenerated as part of the exhaust after-treatment process at predetermined intervals. Passive regeneration uses exhaust heat and a catalyst to burn the deposited soot, whereas active regeneration uses external energy, such as injection of Diesel fuel into an upstream Diesel oxidation catalyst (DOC), to burn the soot. Since the regeneration process consumes fuel, robust and efficient operation based on accurate knowledge of the particulate matter deposit (soot load) is essential to keep fuel consumption at a minimum. In this thesis, we propose a sensing method for a DPF that can accurately measure the in-situ soot load using Electrical Capacitance Tomography (ECT). Simulation results show that the proposed method offers an effective way to accurately estimate the soot load in a DPF. The proposed method is expected to have a profound impact on improving the overall PM filtering efficiency (and thereby fuel efficiency) and the durability of the Diesel Particulate Filter through appropriate closed-loop regeneration.
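As an illustration of how ECT data can be turned into a soot-distribution estimate, the following is a minimal linear back-projection sketch using normalized capacitances and a precomputed sensitivity matrix; the electrode geometry, calibration and soot-load conversion used in the thesis are not reproduced, and the function and variable names are placeholders.

```python
import numpy as np

def ect_linear_backprojection(cap, cap_empty, cap_full, sensitivity):
    """Linear back projection for ECT.

    cap, cap_empty, cap_full : (n_pairs,) measured capacitances and the low/high
                               permittivity calibration measurements
    sensitivity              : (n_pairs, n_pixels) precomputed sensitivity maps
    Returns a normalized permittivity image in [0, 1]; its mean (or a calibrated
    sum) can serve as a relative soot-load estimate.
    """
    lam = (cap - cap_empty) / (cap_full - cap_empty)   # normalized capacitances
    norm = sensitivity.sum(axis=0)
    norm = np.where(norm > 0, norm, 1.0)               # avoid division by zero
    g = (sensitivity.T @ lam) / norm                    # back projection per pixel
    return np.clip(g, 0.0, 1.0)
```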
40 |
Vision Beyond Optics: Standardization, Evaluation and Innovation for Fluorescence Microscopy in Life Sciences
Huisman, Maximiliaan 01 April 2019 (has links)
Fluorescence microscopy is an essential tool in the biomedical sciences that allows specific molecules to be visualized in the complex and crowded environment of cells. The continuous introduction of new imaging techniques makes microscopes more powerful and versatile, but there is more than meets the eye. In addition to developing new methods, we can work towards getting the most out of existing data and technologies. By harnessing this unused potential, this work aims to increase the richness, reliability, and power of fluorescence microscopy data in three key ways: through standardization, evaluation and innovation.
A universal standard makes it easier to assess, compare and analyze imaging data, from the level of a single laboratory to the broader life-sciences community. We propose a data standard for fluorescence microscopy that can increase confidence in experimental results, facilitate the exchange of data, and maximize compatibility with current and future data analysis techniques.
Cutting-edge imaging technologies often rely on sophisticated hardware and multi-layered algorithms for reconstruction and analysis. Consequently, the trustworthiness of new methods can be difficult to assess. To evaluate the reliability and limitations of complex methods, quantitative analyses, such as the one presented here for the 3D SPEED method, are paramount.
The limited resolution of optical microscopes prevents direct observation of macromolecules like DNA and RNA. We present a multi-color, achromatic, cryogenic fluorescence microscope that has the potential to produce multi-color images with sub-nanometer precision. This innovation would move fluorescence imaging beyond the limitations of optics and into the world of molecular resolution.