
Quantification and Maximization of Performance Measures for Photon Counting Spectral Computed Tomography

Yveborg, Moa, January 2015
During my time as a PhD student in the Physics of Medical Imaging group at KTH, I have taken part in the development of a photon counting, spectrally resolved silicon detector for clinical computed tomography. This work has largely motivated the direction of my research and is the main reason for my focus on certain issues. Early in the work, a need to quantify and optimize the performance of a spectrally resolved detector was identified. A large part of my work has thus consisted of reviewing conventional methods used for performance quantification and optimization in computed tomography, and identifying which are best suited for the characterization of a spectrally resolved system. In addition, my work has included comparisons of conventional systems with the detector we are developing. The collected results after a little more than four years of work are four publications and three conference papers. This compilation thesis consists of five introductory chapters and my four publications. The introductory chapters are not self-contained in the sense that the theory and results from all my published work are included; rather, they are written to provide a context in which the papers should be read. The first two chapters treat the general purpose of the introductory chapters and the theory of computed tomography, including the distinction between conventional (non-spectral) computed tomography and different practical implementations of spectral computed tomography. The third chapter reviews the conventional methods developed for quantification and optimization of image quality in terms of detectability and signal-to-noise ratio, part of which is included in my published work.
In addition, the theory on which material basis decomposition is based is presented, together with a condensed version of the results from my work comparing two systems with fundamentally different practical solutions for material quantification. In the fourth chapter, previously unpublished measurements on the photon counting, spectrally resolved detector we are developing are presented and compared to Monte Carlo simulations. In the fifth and final chapter, a summary of the appended publications is included. / QC 20150303
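The material basis decomposition mentioned in this abstract can be illustrated with a minimal sketch: linear attenuation is modeled as a weighted sum of two energy basis functions, and the basis coefficients are recovered from multi-energy measurements. The basis shapes, energy grid, and coefficient values below are illustrative assumptions, not the thesis's detector model.

```python
import numpy as np

# Minimal two-material basis decomposition sketch (illustrative assumptions,
# not the thesis's detector model). Linear attenuation at energy E is modeled
# as a weighted sum of two basis functions: a photoelectric-like ~E^-3 term
# and a crude flat stand-in for the Compton (Klein-Nishina) component.
energies = np.linspace(30.0, 120.0, 8)           # keV sample points ("bins")

def photoelectric(E):
    return (E / 30.0) ** -3.0                    # normalized ~1/E^3 shape

def compton(E):
    return np.ones_like(E)                       # flat stand-in component

A = np.column_stack([photoelectric(energies), compton(energies)])

# Synthetic noise-free "measurement" generated from known coefficients.
a_true = np.array([2.0, 0.5])
mu_measured = A @ a_true

# Recover the basis coefficients by linear least squares.
a_est, *_ = np.linalg.lstsq(A, mu_measured, rcond=None)
print(a_est)    # recovers the true coefficients in the noise-free case
```

In practice the decomposition is estimated from noisy photon counts in each energy bin (e.g. by maximum likelihood) rather than by noise-free least squares, but the linear basis model is the same.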

Scalable Estimation and Testing for Complex, High-Dimensional Data

Lu, Ruijin, 22 August 2019
With modern high-throughput technologies, scientists can now collect high-dimensional data of various forms, including brain images, medical spectrum curves, engineering signals, etc. These data provide a rich source of information on disease development, cell evolution, engineering systems, and many other scientific phenomena. To achieve a clearer understanding of the underlying mechanisms, one needs a fast and reliable analytical approach to extract useful information from this wealth of data. The goal of this dissertation is to develop novel methods that enable scalable estimation, testing, and analysis of complex, high-dimensional data. It contains three parts: parameter estimation based on complex data, powerful testing of functional data, and the analysis of functional data supported on manifolds. The first part focuses on a family of parameter estimation problems in which the relationship between the data and the underlying parameters cannot be explicitly specified by a likelihood function. We introduce a wavelet-based approximate Bayesian computation approach that is likelihood-free and computationally scalable. This approach is applied to two problems: estimating mutation rates of a generalized birth-death process from fluctuation experiment data, and estimating target parameters from foliage echoes. The second part focuses on functional testing. We consider multiple testing in basis space via p-value-guided compression. Our theoretical results demonstrate that, under regularity conditions, the Westfall-Young randomization test in basis space achieves strong control of the family-wise error rate and asymptotic optimality. Furthermore, appropriate compression in basis space leads to improved power compared to point-wise testing in the data domain or basis-space testing without compression.
The effectiveness of the proposed procedure is demonstrated through two applications: the detection of regions of spectral curves associated with pre-cancer using one-dimensional fluorescence spectroscopy data, and the detection of disease-related regions using three-dimensional Alzheimer's disease neuroimaging data. The third part focuses on analyzing data measured on the cortical surfaces of monkeys' brains during their early development, where subjects are measured at misaligned time markers. In this analysis, we examine asymmetric patterns and increasing or decreasing trends in the monkeys' brains across time. / Doctor of Philosophy / With modern high-throughput technologies, scientists can now collect high-dimensional data of various forms, including brain images, medical spectrum curves, engineering signals, and biological measurements. These data provide a rich source of information on disease development, engineering systems, and many other scientific phenomena. The goal of this dissertation is to develop novel methods that enable scalable estimation, testing, and analysis of complex, high-dimensional data. It contains three parts: parameter estimation based on complex biological and engineering data, powerful testing of high-dimensional functional data, and the analysis of functional data supported on manifolds. The first part focuses on a family of parameter estimation problems in which the relationship between the data and the underlying parameters cannot be explicitly specified by a likelihood function. We introduce a computation-based statistical approach that achieves efficient parameter estimation scalable to high-dimensional functional data. The second part focuses on developing a powerful testing method for functional data that can be used to detect important regions. We establish desirable theoretical properties of this approach.
The effectiveness of this testing approach is demonstrated in two applications: the detection of regions of the spectrum related to pre-cancer using fluorescence spectroscopy data, and the detection of disease-related regions using brain image data. The third part focuses on analyzing brain cortical thickness data, measured on the cortical surfaces of monkeys' brains during early development, with subjects measured at misaligned time markers. Using functional data estimation and testing approaches, we are able to: (1) identify asymmetric regions between the right and left brains across time, and (2) identify spatial regions on the cortical surface that reflect increases or decreases in cortical measurements over time.
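The likelihood-free estimation idea in the first part of this dissertation can be sketched with a toy rejection-sampling version of approximate Bayesian computation: draw parameters from a prior, simulate data forward, and keep draws whose summary statistic lands close to the observed one. The Poisson model, flat prior, mean summary statistic, and tolerance below are all illustrative assumptions, not the dissertation's wavelet-based method.

```python
import numpy as np

# Toy rejection-sampling sketch of likelihood-free (approximate Bayesian
# computation) estimation. The Poisson model, flat prior, mean summary
# statistic, and tolerance are illustrative assumptions, not the
# dissertation's wavelet-based method.
rng = np.random.default_rng(0)

lam_true = 4.0
observed = rng.poisson(lam_true, size=200)       # stand-in "experimental" data
obs_summary = observed.mean()                    # summary statistic

accepted = []
for _ in range(20000):
    lam = rng.uniform(0.0, 10.0)                 # draw from a flat prior
    sim = rng.poisson(lam, size=200)             # simulate forward; no likelihood used
    if abs(sim.mean() - obs_summary) < 0.1:      # keep draws that match closely
        accepted.append(lam)

posterior_mean = float(np.mean(accepted))
print(posterior_mean)    # should land near lam_true
```

The accepted draws approximate the posterior distribution of the parameter; wavelet-based summaries, as described in the abstract, replace the simple mean when the data are high-dimensional curves rather than counts.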
