1 |
A Methodology for the Design of Spaceborne Pencil-Beam Scatterometer Systems. Spencer, Michael W. 14 May 2003
Spaceborne scatterometer instruments are important tools for the remote sensing of the Earth's environment. In addition to the primary goal of measuring ocean winds, data from scatterometers have proven useful in the study of a variety of land and cryosphere processes as well. Several satellites carrying scatterometers have flown in the last two decades. These previous systems have been "fan-beam" scatterometers, where multiple antennas placed in fixed positions are used. The fan-beam scatterometer approach, however, has disadvantages which limit its utility for future missions. An alternate approach, the conically-scanning "pencil-beam" scatterometer technique, alleviates many of the problems encountered with earlier systems and provides additional measurement capability. Due to these advantages, the pencil-beam approach has been selected by NASA as the basis for future scatterometer missions. Whereas the fan-beam approach is mature and well understood, there is a need for a fundamental study of the unique aspects of the pencil-beam technique.
In this dissertation, a comprehensive treatment of the design issues associated with pencil-beam scatterometers is presented. A new methodology is established for evaluating and optimizing the performance of conically-scanning radar systems. Employing this methodology, key results are developed and used in the design of the SeaWinds instrument, NASA's first pencil-beam scatterometer. Further, the theoretical framework presented in this study is used to propose new scatterometer techniques which will significantly improve the spatial resolution and measurement accuracy of future instruments.
|
2 |
System Optimization and Patient Translational Motion Correction for Reduction of Artifacts in a Fan-Beam CT Scanner. Wise, Zachary Gordon Lee. 19 September 2012
No description available.
|
3 |
Image Reconstruction Based On Hilbert And Hybrid Filtered Algorithms With Inverse Distance Weight And No Backprojection Weight. Narasimhadhan, A V. 08 1900
Filtered backprojection (FBP) reconstruction algorithms are very popular in the field of X-ray computed tomography (CT) because they offer advantages in terms of numerical accuracy and computational complexity. Ramp-filter-based fan-beam FBP reconstruction algorithms have a position-dependent weight in the backprojection, which is responsible for a spatially non-uniform distribution of noise and resolution, and for artifacts. Many algorithms based on shift-variant filtering or spatially-invariant interpolation in the backprojection step have been developed to deal with this issue. However, these algorithms are computationally demanding. Recently, fan-beam algorithms based on Hilbert filtering with an inverse distance weight or with no weight in the backprojection have been derived using Hamaker's relation. These fan-beam reconstruction algorithms have been shown to improve the uniformity of noise and resolution.
In this thesis, fan-beam FBP reconstruction algorithms with an inverse distance backprojection weight and with no backprojection weight for 2D image reconstruction are presented and discussed for the two fan-beam scan geometries: equi-angular and equi-spaced detector arrays. Based on these fan-beam reconstruction algorithms, new 3D cone-beam FDK reconstruction algorithms with circular and helical scan trajectories for curved and planar detector geometries are proposed. To start with, three rebinning formulae from the literature are presented, and it is shown that all fan-beam FBP reconstruction algorithms can be derived from them. Specifically, two fan-beam algorithms with no backprojection weight based on Hilbert filtering for an equi-spaced linear detector array, and one new fan-beam algorithm with an inverse distance backprojection weight based on hybrid filtering for both equi-angular and equi-spaced linear detector arrays, are derived. Simulation results for these algorithms in terms of uniformity of noise and resolution, in comparison to the standard ramp-filter-based fan-beam FBP reconstruction algorithm, are presented. It is shown through simulation that the fan-beam reconstruction algorithm with an inverse distance weight in the backprojection gives better noise performance while retaining the resolution properties. A comparison between the above-mentioned reconstruction algorithms is given in terms of computational complexity.
The state-of-the-art 3D X-ray imaging systems in medicine with cone-beam (CB) circular and helical computed tomography scanners use non-exact (approximate) FBP-based reconstruction algorithms. They are attractive because of their simplicity and low computational cost. However, they produce sub-optimal reconstructed images with respect to cone-beam artifacts, noise, and, in the case of circular-trajectory scan imaging, axial intensity drop. The axial intensity drop in the reconstructed image is due to the insufficient data acquired by circular-scan-trajectory CB CT. This thesis investigates improving image quality by means of Hilbert- and hybrid-filtering-based algorithms using redundant data in Feldkamp, Davis and Kress (FDK) type reconstruction algorithms. New FDK type reconstruction algorithms for cylindrical and planar detectors in CB circular CT are developed, obtained by extending to three dimensions (3D) an exact Hilbert-filtering-based FBP algorithm for 2D fan-beam reconstruction with no position-dependent backprojection weight, and a fan-beam algorithm with an inverse distance backprojection weight. The proposed FDK reconstruction algorithm with an inverse distance weight in the backprojection requires full-scan projection data, while the FDK reconstruction algorithm with no backprojection weight can handle partial-scan data, including very short scans. The FDK reconstruction algorithms with no backprojection weight for circular CB CT are compared with Hu's, FDK, and T-FDK reconstruction algorithms in terms of axial intensity drop and computational complexity. Simulation results on noise, CB artifact performance, and execution timing, as well as the partial-scan reconstruction abilities, are presented.
We show that the FDK reconstruction algorithms with no backprojection weight have better noise performance characteristics than the conventional FDK reconstruction algorithm, whose backprojection weight is known to result in spatially non-uniform noise characteristics.
In this thesis, we also present an efficient method to reduce the axial intensity drop in circular CB CT. It consists of two steps: first, reconstructing the object using the FDK reconstruction algorithm with no backprojection weight, and second, estimating the missing term. The method is comparable to Zhu et al.'s method in terms of reduction in axial intensity drop, noise, and computational complexity.
The helical scanning trajectory satisfies the Tuy-Smith condition; hence, exact and stable reconstruction is possible. However, the helical FDK reconstruction algorithm produces cone-beam artifacts because its derivation is approximate. In this thesis, helical FDK reconstruction algorithms based on Hilbert filtering with no backprojection weight and on hybrid filtering with an inverse distance backprojection weight are presented to reduce the CB artifacts. These algorithms are compared with the standard helical FDK algorithm in terms of noise, CB artifacts, and computational complexity.
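For orientation, the conventional circular-scan FDK algorithm for a flat detector can be written, in one common parameterization (the notation here is generic, not taken from the thesis), as:

```latex
f(\mathbf{x}) \approx \frac{1}{2}\int_{0}^{2\pi}
  \frac{D^{2}}{U(\mathbf{x},\beta)^{2}}
  \int_{-\infty}^{\infty}
    \frac{D}{\sqrt{D^{2}+u^{2}+v^{2}}}\;
    p(\beta,u,v)\;
    h\bigl(u(\mathbf{x},\beta)-u\bigr)\,du\,d\beta ,
\qquad
U(\mathbf{x},\beta) = D + x\cos\beta + y\sin\beta ,
```

where $D$ is the source-to-axis distance, $p(\beta,u,v)$ are the cone-beam projections, $h$ is the ramp-filter kernel, and $U$ is the source-to-pixel distance along the central ray. The $1/U^{2}$ factor is the position-dependent backprojection weight; the Hilbert- and hybrid-filtered variants studied in this thesis remove it or replace it with an inverse-distance weight, which is what improves the uniformity of noise and resolution.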
|
4 |
Absolute quantification in brain SPECT imaging. Cot Sanz, Albert. 17 December 2003
Many forms of brain disease are associated with problems in the neurotransmission systems. One approach to the assessment of such systems is Single Photon Emission Computed Tomography (SPECT) brain imaging. Neurotransmission SPECT has become an important tool in neuroimaging and is today regarded as a useful method in both clinical and basic research. SPECT non-invasively visualizes and analyzes the functions and properties of different organs and tissues in Nuclear Medicine. Although visual inspection is often sufficient to assess neurotransmission imaging, quantification can improve the diagnostic accuracy of SPECT studies of the dopaminergic system. In particular, quantification of neurotransmission SPECT studies in Parkinson's disease could help diagnose the illness in its early pre-clinical stages. Achieving early, indeed preclinical, diagnosis of neurodegenerative illnesses is one of the main research topics in SPECT.
In this field, detailed analysis of the shapes and values of the regions of interest (ROIs) of the image is important, and thus quantification is needed. Moreover, quantification allows follow-up of the progression of the disease and assessment of the effects of potential neuroprotective treatment strategies. Therefore, the aim of this thesis is to achieve quantification of both the absolute activity values and the relative values of the reconstructed SPECT images. Quantification is affected by the degradation of the image introduced by statistical noise, attenuation, the collimator/detector response, and scattering effects. Some of these degradations may be corrected by using iterative reconstruction algorithms, which thus enable more reliable quantification. The importance of correcting degradations in reconstruction algorithms to improve the quantification accuracy of brain SPECT studies has been proved. Monte Carlo simulations are the "gold standard" for testing reconstruction algorithms in Nuclear Medicine. We analyzed the available Monte Carlo codes and chose SimSET as a virtual phantom simulator. A new stopping criterion in SimSET was established in order to reduce the simulation time. The modified SimSET version was validated as a virtual phantom simulator that reproduces realistic projection data sets in SPECT studies. Iterative algorithms permit modelling of the projection process, allowing for correction of the spatially variant collimator response and the photon crosstalk effect between transaxial slices. Thus, our work focused on modelling the collimator/detector response for the parallel and fan-beam configurations using the Monte Carlo code PENELOPE. Moreover, a full 3D reconstruction with OS-EM algorithms was developed. Finally, scattering has been recognized as one of the most significant degradation effects in SPECT quantification, and it is nowadays an intensive field of research in SPECT techniques.
Monte Carlo techniques appear to be the most reliable way to include this correction. The use of the modified SimSET simulator accelerates the forward projection process, although the computational burden remains a challenge for this technique. Full 3D reconstruction applied simultaneously with Monte Carlo-based scattering correction and the 3D evaluation procedure is a major upgrade for obtaining valuable, absolute quantitative estimates of the reconstructed images. Once all the degrading effects were corrected, the obtained values were 95% of the theoretical values; thus, absolute quantification was achieved.
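The OS-EM reconstruction mentioned above reduces, with a single subset, to the classical MLEM update. The following is an illustrative sketch only: the system matrix here is a random toy stand-in, not a SPECT model with collimator response or attenuation, and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy system matrix A (projection bins x voxels) and a nonnegative ground
# truth; in real SPECT, A would encode the modelled collimator/detector
# response, attenuation, and scatter.
A = rng.uniform(0.1, 1.0, size=(40, 10))
x_true = rng.uniform(0.5, 2.0, size=10)
y = A @ x_true                       # noise-free, consistent projection data

# MLEM (OS-EM with a single subset): a multiplicative update that preserves
# positivity and monotonically increases the Poisson likelihood.
x = np.ones(10)                      # uniform initial estimate
sens = A.sum(axis=0)                 # sensitivity image, A^T * 1
for _ in range(500):
    x *= (A.T @ (y / (A @ x))) / sens
```

OS-EM accelerates this by cycling the same update over subsets of the projection rows; the structure of each update is unchanged.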
|
5 |
Study and design of new multibeam antenna architectures in Ku and Ka bands for broadband satellite applications. Diallo, Cheikh-Dieylar. 19 December 2016
Multi-beam antennas (MBAs) are crucial to modern and future civilian and military satellite telecommunications applications. The low part of the electromagnetic spectrum is congested, while wide frequency bands are available in the Ka-band, in which broadband missions have emerged over the last decade. The trend is to reduce the size of the spots in multi-spot coverage in order to reduce the cost of satellites; hence, more electrically large antennas are needed, with major technological breakthroughs as a consequence. Luneburg lenses in a parallel-plate waveguide (PPW) are attractive solutions for feeding MBAs, since they can lead to beam-forming networks with wide bandwidth and field of view, low loss and cost, that are easy to design, manufacture, and accommodate. This PhD deals with the development of novel implementations and the design of broadband, low-loss, wide-field-of-view Luneburg-lens-based MBAs. The implementation of the Luneburg lens is known to be a major technological challenge. A state of the art of the implementation techniques is presented. Then, two novel implementations of the Luneburg lens in a PPW environment are proposed, together with a design method, process, and tools. The first implementation consists of a periodic, regular array of subwavelength vertical metal posts in a PPW of variable plate spacing. The post height and the PPW spacing modulate the equivalent refractive index.
The all-metal 9-beam antenna that was designed, manufactured, and measured has 8314 posts and shows excellent performance, better than the traditional constant-PPW-spacing version. The second implementation consists of a periodic, regular array of subwavelength circular holes etched in the copper cladding of a dielectric substrate, with a fixed air gap between the plane of the holes and the PPW top plate. The radius of the holes controls the equivalent index. The 5-beam antenna designed with this technique has 2696 holes and shows very good performance compared to similar devices in the literature.
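The graded index that both implementations approximate is the classical Luneburg law, n(r) = sqrt(2 - (r/R)^2), which is standard optics and not specific to this thesis. A small sketch of the profile (the mapping from index to post height or hole radius is the thesis's contribution and is not reproduced here):

```python
import numpy as np

def luneburg_index(r, R=1.0):
    """Classical Luneburg-lens refractive-index law: n(r) = sqrt(2 - (r/R)^2).

    A point source on the rim is collimated into a plane wave on the opposite
    side, enabling one lens to serve several beam-port feeds. The thesis
    realizes this graded index in a parallel-plate waveguide via post height
    and plate spacing (or hole radius in the PCB version).
    """
    r = np.asarray(r, dtype=float)
    return np.sqrt(2.0 - (r / R) ** 2)

# Index samples from the lens centre (n = sqrt(2)) to the rim (n = 1).
profile = luneburg_index(np.linspace(0.0, 1.0, 5))
```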
|
6 |
Performance Evaluation Of Fan-beam And Cone-beam Reconstruction Algorithms With No Backprojection Weight On Truncated Data Problems. Sumith, K. 07 1900
This work focuses on using linear-prediction-based projection completion with fan-beam and cone-beam reconstruction algorithms that have no backprojection weight. Truncated data problems are well known in computed tomography research; however, perfect image reconstruction from truncated data has not yet been achieved, and only approximately accurate solutions have been obtained, so research in this area continues to strive for results close to the perfect one. Linear prediction techniques are adopted for truncation completion in this work because previous research on truncated data problems has shown that they work well compared to other techniques such as polynomial fitting and iterative methods. Linear prediction is a model-based technique. The autoregressive (AR) and moving average (MA) models are the two important models, along with the autoregressive moving average (ARMA) model. The AR model is used in this work because of the simplicity it provides in calculating the prediction coefficients. The order of the model is chosen based on the partial autocorrelation function of the projection data, as established in previous research in this area. Truncated projection completion using linear prediction and windowed linear prediction shows that reasonably accurate reconstruction is achieved. Windowed linear prediction provides a better estimate of the missing data; the reason for this is given in the literature and is restated here for the reader's convenience.
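A toy illustration of AR-based extrapolation on a synthetic 1-D signal, not real projection data; the helper below is hypothetical and only sketches the idea of fitting AR coefficients to the known samples and extrapolating past the truncation point.

```python
import numpy as np

def ar_extrapolate(x, p, n_extra):
    """Fit an AR(p) model to x by least squares, then extrapolate n_extra
    samples past the end -- a toy analogue of completing truncated
    projection data with linear prediction."""
    # Regression: x[t] ~ a1*x[t-p] + ... + ap*x[t-1] over all full windows.
    rows = np.array([x[i:i + p] for i in range(len(x) - p)])
    coeffs, *_ = np.linalg.lstsq(rows, x[p:], rcond=None)
    out = list(x)
    for _ in range(n_extra):
        out.append(np.dot(coeffs, out[-p:]))   # one-step-ahead prediction
    return np.array(out)

t = np.arange(80)
signal = np.sin(2 * np.pi * t / 16)            # smooth "projection profile"
completed = ar_extrapolate(signal[:60], p=4, n_extra=20)
```

For a smooth, quasi-periodic profile like this, a low-order AR fit extrapolates almost exactly; the windowed variant discussed above weights recent samples more heavily when fitting the coefficients.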
The advantages of the fan-beam reconstruction algorithms with no backprojection weight over those with a backprojection weight motivated us to use the former for reconstructing the truncation-completed projection data. The results obtained are compared with previous work that used conventional fan-beam reconstruction algorithms with a backprojection weight. The intensity plots and the noise performance results show improvements resulting from using the fan-beam reconstruction algorithm with no backprojection weight. The work is also extended to the Feldkamp, Davis, and Kress (FDK) reconstruction algorithm with no backprojection weight for the helical scanning geometry, and the results obtained are compared with the FDK reconstruction algorithm with backprojection weight for the same geometry.
|