1.
An investigation into the use of scattered photons to improve 2D Positron Emission Tomography (PET) functional imaging quality. Sun, Hongyan. January 2012.
Positron emission tomography (PET) is a powerful metabolic imaging modality designed to detect two anti-parallel 511 keV photons originating from a positron-electron annihilation. However, one or both of the annihilation photons may undergo Compton scattering in the object. This is more serious for a scanner operated in 3D mode or with large patients, where the scatter fraction can be as high as 40-60%. When one or both photons are scattered, the line of response (LOR) defined by connecting the two relevant detectors no longer passes through the annihilation position. Scattered coincidences therefore degrade image contrast and compromise quantitative accuracy. Various scatter correction methods have been proposed, but most of them are based on estimating the scatter and either subtracting it from the measured data or incorporating it into an iterative reconstruction algorithm.
By accurately measuring the scattered photon energy and taking advantage of the kinematics of Compton scattering, two circular arcs (TCA) can be identified in 2D that describe the locus of all possible scattering positions and encompass the point of annihilation. In the limiting case where the scattering angle approaches zero, the TCA approach the LOR for true coincidences. Based on this knowledge, a Generalized Scatter (GS) reconstruction algorithm has been developed in this thesis, which can use both true and scattered coincidences to extract the activity distribution in a consistent way. The annihilation position within the TCA can be further confined by adding a patient outline as a constraint in the GS algorithm. An attenuation correction method for the scattered coincidences was also developed to remove imaging artifacts, and a geometrical model that characterizes the different probabilities of the annihilation positions within the TCA was proposed; this speeds up image convergence and improves reconstructed image quality. Finally, the GS algorithm has been adapted to deal with non-ideal energy resolutions. In summary, an algorithm that implicitly incorporates scattered coincidences into the image reconstruction has been developed. Our results demonstrate that this eliminates the need for scatter correction and can improve system sensitivity and image quality.
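The geometry behind the TCA follows from two standard facts: the Compton relation fixes the scattering angle once the scattered photon's energy is measured, and by the inscribed-angle theorem the scatter point must see the chord joining the two detectors under the angle pi minus that scattering angle. The sketch below is illustrative only and is not the thesis's GS implementation; the function names (`compton_angle`, `tca_circles`) and the 450 keV example are assumptions made for the example.

```python
import numpy as np

MEC2 = 511.0  # electron rest energy in keV; also the annihilation photon energy

def compton_angle(e_scattered):
    """Scattering angle (rad) of a 511 keV photon detected with energy e_scattered (keV)."""
    cos_theta = 2.0 - MEC2 / e_scattered          # from E' = 511 / (2 - cos(theta))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))

def tca_circles(d1, d2, e_scattered):
    """Return the two circles (center, radius) whose arcs through the detector
    positions d1 and d2 form the locus of possible scatter points: every point
    on them sees the chord d1-d2 under the inscribed angle pi - theta."""
    d1, d2 = np.asarray(d1, float), np.asarray(d2, float)
    alpha = np.pi - compton_angle(e_scattered)    # inscribed angle at the scatter point
    chord = d2 - d1
    L = np.linalg.norm(chord)
    radius = L / (2.0 * np.sin(alpha))            # law of sines; diverges as theta -> 0
    mid = 0.5 * (d1 + d2)
    n = np.array([-chord[1], chord[0]]) / L       # unit normal to the chord
    offset = L / (2.0 * np.tan(alpha))            # distance from the chord midpoint to each center
    return [(mid + offset * n, radius), (mid - offset * n, radius)]

# Example: detectors 40 cm apart, scattered photon detected at 450 keV.
circles = tca_circles((-200.0, 0.0), (200.0, 0.0), 450.0)
```

As the measured energy approaches 511 keV the radius diverges and the arcs collapse onto the straight line between the detectors, which is the limiting behaviour described above.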
2.
Computational Methods for Protein Structure Comparison and Analysis. Xusi Han (8797445). 05 May 2020.
Proteins are involved in almost all functions of a living cell, and those functions are realized through their tertiary structures. Protein three-dimensional structures can be solved by several experimental methods, but computational approaches serve as an important complement for comparing and analyzing protein structures. Protein structure comparison allows the transfer of knowledge about known proteins to a novel protein and plays an important role in function prediction. Obtaining a global perspective on the variety and distribution of protein structures also lays a foundation for understanding the building principles of protein structures. This dissertation introduces our computational method for comparing protein 3D structures and presents a novel mapping of protein shapes that represents the variety and the similarities of the 3D shapes of proteins and their assemblies. The methods developed in this work can be applied to obtain new biological insights into protein atomic structures and electron density maps.
3.
Data Fusion for the Problem of Protein Sidechain Assignment. Lei, Yang. 01 January 2010.
In this thesis, we study the problem of protein side chain assignment (SCA) given multiple sources of experimental and modeling data. In particular, the mechanism of X-ray crystallography is re-examined using Fourier analysis, and a novel probabilistic model of the X-ray experiment is proposed for SCA decision making. The relationship between the X-ray measurements and the desired structure is reformulated in terms of the Discrete Fourier Transform (DFT). The decision making is performed by developing a new resolution-dependent electron density map (EDM) model and applying Maximum Likelihood (ML) estimation, which reduces to the Least Squares (LS) solution. Calculation of the confidence probability associated with this decision making is also given. One possible extension of this model is real-space refinement when a continuous conformational space is used.
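The reduction of ML estimation to least squares is the standard consequence of a linear measurement model with independent Gaussian noise: maximizing the likelihood is then equivalent to minimizing the squared residual. The snippet below is a minimal sketch under that assumption; the design matrix `A` and the problem dimensions are placeholders, not the thesis's actual resolution-dependent EDM model.

```python
import numpy as np

# Hypothetical linear model: measurements y = A @ x + Gaussian noise, where x
# parameterizes a candidate side-chain configuration. With i.i.d. Gaussian
# noise, the ML estimate of x is exactly the least-squares solution.
rng = np.random.default_rng(0)
A = rng.normal(size=(200, 10))                 # placeholder design matrix
x_true = rng.normal(size=10)
y = A @ x_true + 0.05 * rng.normal(size=200)   # simulated noisy measurements

x_ml, *_ = np.linalg.lstsq(A, y, rcond=None)   # ML == LS under the Gaussian assumption
fit_error = np.linalg.norm(A @ x_ml - y)       # residual used to score the fit
```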
Furthermore, we present a data fusion scheme that combines multiple sources of data to solve the SCA problem. The merit of our framework is its capability of exploiting multiple sources of information to make decisions from a probabilistic perspective based on Bayesian inference. Although our approach targets the SCA problem, it can readily be extended to solving for the entire protein structure.
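As a rough illustration of such a fusion step (a sketch under assumed conditional independence of the sources, not the scheme from the thesis): if each data source yields a log-likelihood over a discrete set of candidate side-chain configurations, Bayesian fusion amounts to summing the log-likelihoods with a log-prior, normalizing, and reporting the maximum a posteriori candidate together with its posterior probability as the confidence of the decision.

```python
import numpy as np

def fuse_and_decide(log_likelihoods, log_prior=None):
    """Fuse per-source log-likelihoods over candidate configurations
    (rows: sources, columns: candidates) and return the MAP candidate
    plus its posterior probability as a confidence value."""
    total = np.sum(log_likelihoods, axis=0)      # independence => log-likelihoods add
    if log_prior is not None:
        total = total + log_prior
    total -= total.max()                         # for numerical stability
    posterior = np.exp(total) / np.exp(total).sum()
    best = int(np.argmax(posterior))
    return best, float(posterior[best])

# Example: three independent sources scoring five candidate configurations.
scores = np.log(np.random.rand(3, 5) + 1e-9)
choice, confidence = fuse_and_decide(scores)
```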