1

A MATHEMATICAL MODEL OF SINGLE-PHOTON EMISSION COMPUTED TOMOGRAPHY (RADON TRANSFORM, COMPTON SCATTER, ATTENUATION, NUCLEAR MEDICINE).

CLOUGH, ANNE VIRGINIA. January 1986 (has links)
Single-photon emission computed tomography (SPECT) is a nuclear-medicine imaging technique that has been shown to provide clinically useful images of radionuclide distributions within the body. The problem of quantitatively determining tomographic activity images from a projection data set leads to a mathematical inverse problem formulated as an integral equation. The solution of this problem then depends on an accurate mathematical model as well as a reliable and efficient inversion algorithm. The effects of attenuation and Compton scatter within the body have been incorporated into the model in order to make it more physically realistic. The attenuated Radon transform is the mathematical basis of SPECT. In this work, the case of constant attenuation is reviewed and a new proof of the Tretiak-Metz algorithm is presented. A space-domain version of the inverse attenuated Radon transform is derived. A special case of this transform, the attenuated Abel transform, which applies when the object is rotationally symmetric, is derived and its inverse is found. A numerical algorithm implementing the inverse attenuated Radon transform with constant attenuation is described, and computer simulations are performed to demonstrate the results of the inversion procedure. With the use of the single-scatter approximation and an energy-windowed detector, the effects of Compton scatter are incorporated into the model. The data are then taken to be the sum of primary photons and single-scattered photons. The scattered photons are modeled by a scatter operator acting on the original activity distribution within the object, where the operator consists of convolution with a given analytic kernel followed by a boundary cut-off operation. A solution is given by first applying the inverse attenuated Radon transform to the data set. This leads to a Fredholm integral equation for which a Neumann series solution is constructed. Again, simulations are performed to validate the accuracy of the assumptions within the model and to numerically demonstrate the reconstruction procedure.
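The final step described above, solving a Fredholm integral equation of the second kind by a Neumann series, can be illustrated numerically. The sketch below is only schematic: a made-up Gaussian convolution kernel stands in for the scatter operator (the thesis's operator also includes a boundary cut-off), and the iteration a <- b + K a is run until it settles at the fixed point of the discretized equation.

```python
import numpy as np

# Schematic Neumann-series solution of a discretized Fredholm equation of the
# second kind, a = b + K a. The kernel K is a hypothetical small-amplitude
# Gaussian convolution matrix, not the thesis's scatter operator.

def neumann_series_solve(K, b, n_iter=100):
    """Iterate a <- b + K a; converges when the spectral radius of K is below 1."""
    a = b.copy()
    for _ in range(n_iter):
        a = b + K @ a
    return a

n = 128
x = np.linspace(-1.0, 1.0, n)
dx = x[1] - x[0]
K = 0.1 * np.exp(-((x[:, None] - x[None, :]) ** 2) / 0.05) * dx  # toy kernel
b = np.exp(-x ** 2 / 0.1)   # stand-in for the inverse-transformed projection data
a = neumann_series_solve(K, b)
print("max fixed-point residual:", np.max(np.abs(a - (b + K @ a))))
```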
2

Physiology-based Mathematical Models for the Intensive Care Unit: Application to Mechanical Ventilation

Albanese, Antonio January 2014 (has links)
This work takes us a step closer to realizing personalized medicine, complementing the empirical and heuristic way in which clinicians typically work. This thesis presents mechanistic models of physiology. Given continuous signals from a patient, these models can be fine-tuned via parameter estimation methods so that the model's outputs match the patient's. We thus obtain a virtual patient mimicking the patient at hand. Therapeutic scenarios can then be applied, and optimal diagnosis and therapy can be attained. As such, personalized medicine can be achieved without resorting to costly genetics. In particular, we have developed a novel comprehensive mathematical model of the cardiopulmonary system that includes cardiovascular circulation, respiratory mechanics, tissue and alveolar gas exchange, as well as short-term neural control. Validity of the model was demonstrated by excellent agreement with real patient data taken from the literature, under normo-physiological as well as hypercapnic and hypoxic conditions. As a concrete example, a submodel of the lung mechanics was fine-tuned using real patient data, and personalized respiratory parameters (resistance, R_rs, and compliance, C_rs) were estimated continually. This allows the patient's effort (Work of Breathing) to be computed continuously and, more importantly, noninvasively. Finally, the use of Bayesian estimation techniques, which allow the incorporation of population studies and prior information about model parameters, was proposed in the context of patient-specific physiological models. A Bayesian Maximum a Posteriori Probability (MAP) estimator was implemented and applied to a case study of respiratory mechanics. Its superiority over the classical Least Squares method was demonstrated in data-poor conditions using both simulated and real animal data. This thesis can serve as a platform for a plethora of applications in cardiopulmonary personalized medicine.
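The respiratory-mechanics case study lends itself to a compact illustration. The sketch below is an assumption-laden toy, not the thesis's implementation: it fits the single-compartment equation of motion P_aw = R_rs*Q + E_rs*V + P_0 to simulated ventilator data, comparing ordinary least squares with a Gaussian-prior MAP estimator; the prior means, variances, and noise level are invented for the example.

```python
import numpy as np

# Toy comparison of least-squares vs. Bayesian MAP estimation of respiratory
# resistance R_rs and elastance E_rs (= 1/C_rs) from the equation of motion
#   P_aw(t) = R_rs*Q(t) + E_rs*V(t) + P_0.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 3.0, 300)                  # one breath, seconds
Q = np.where(t < 1.0, 0.5, -0.25)               # flow [L/s], square-wave pattern
V = np.cumsum(Q) * (t[1] - t[0])                # volume [L]
R_true, E_true, P0_true = 12.0, 25.0, 5.0       # cmH2O*s/L, cmH2O/L, cmH2O
P = R_true * Q + E_true * V + P0_true + rng.normal(0.0, 2.0, t.size)

A = np.column_stack([Q, V, np.ones_like(t)])

# Ordinary least squares.
theta_ls, *_ = np.linalg.lstsq(A, P, rcond=None)

# MAP with Gaussian prior N(mu, Sigma) and noise variance sigma2:
#   theta = (A^T A / sigma2 + Sigma^-1)^-1 (A^T P / sigma2 + Sigma^-1 mu)
mu = np.array([10.0, 20.0, 5.0])                # hypothetical population means
Sigma_inv = np.diag(1.0 / np.array([25.0, 100.0, 25.0]))
sigma2 = 4.0
theta_map = np.linalg.solve(A.T @ A / sigma2 + Sigma_inv,
                            A.T @ P / sigma2 + Sigma_inv @ mu)

print("LS  [R, E, P0]:", np.round(theta_ls, 2))
print("MAP [R, E, P0]:", np.round(theta_map, 2))
```

With sparse or noisy data, the prior pulls the MAP estimate toward plausible population values, which is the behaviour the thesis exploits in data-poor conditions.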
3

THREE-DIMENSIONAL RADIOGRAPHIC IMAGING

Chiu, Ming-Yee January 1980 (has links)
No description available.
4

The Advantages of Collimator Optimization for Intensity Modulated Radiation Therapy

Unknown Date (has links)
The goal of this study was to improve dosimetry for pelvic, lung, head and neck, and other cancer sites with aspherical planning target volumes (PTV), using a new algorithm for collimator optimization for intensity modulated radiation therapy (IMRT) that minimizes the x-jaw gap (CAX) and the area of the jaws (CAA) for each treatment field. A retrospective study of the effects of collimator optimization on 20 patients was performed by comparing metric results for the new collimator optimization techniques in Eclipse version 11.0. Keeping all other parameters equal, multiple plans were created using four collimator techniques: CA0, with all fields' collimators set to 0°; CAE, using the Eclipse collimator optimization; CAA, minimizing the area of the jaws around the PTV; and CAX, minimizing the x-jaw gap. The minimum-area and minimum-x-jaw angles were found by evaluating each field's beam's-eye view of the PTV with ImageJ and finding the desired parameters with a custom script. The evaluation of the plans included the monitor units (MU), the maximum dose of the plan, the maximum dose to organs at risk (OAR), the conformity index (CI), and the number of fields calculated to split. Compared to the CA0 plans, the monitor units decreased on average by 6% for the CAX method, with a p-value of 0.01 from an ANOVA test. The average maximum dose remained within a 1.1% difference across all four methods, with the lowest given by CAX. The maximum dose to the most at-risk organ was best spared by the CAA method, decreasing by 0.62% compared to CA0. Minimizing the x-jaws significantly reduced the number of split fields, from 61 to 37. In every metric tested, the CAX optimization produced results comparable or superior to the other three techniques. For aspherical PTVs, CAX on average reduced the number of split fields, lowered the maximum dose, minimized the dose to the surrounding OAR, and decreased the monitor units, while maintaining the same control of the PTV. / Includes bibliography. / Dissertation (Ph.D.)--Florida Atlantic University, 2017. / FAU Electronic Theses and Dissertations Collection
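A brute-force version of the collimator search described above can be sketched in a few lines. The code below is a hypothetical stand-in for the custom ImageJ script, using an invented beam's-eye-view PTV outline: it rotates the outline through candidate collimator angles and reports the angles that minimize the x-jaw gap (CAX) and the jaw-defined rectangular area (CAA).

```python
import numpy as np

# Hypothetical collimator-angle search over a beam's-eye-view (BEV) PTV outline.
# Not the study's script; the PTV contour below is an invented elongated ellipse.

def jaw_metrics(ptv_xy, angle_deg):
    """Rotate BEV points into the collimator frame; return (x-jaw gap, jaw area)."""
    a = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
    p = ptv_xy @ rot.T
    x_gap = p[:, 0].max() - p[:, 0].min()
    y_gap = p[:, 1].max() - p[:, 1].min()
    return x_gap, x_gap * y_gap

theta = np.linspace(0.0, 2.0 * np.pi, 360)
ptv = np.column_stack([6.0 * np.cos(theta), 2.0 * np.sin(theta)])  # 6 cm x 2 cm
tilt = np.deg2rad(35.0)
ptv = ptv @ np.array([[np.cos(tilt), -np.sin(tilt)],
                      [np.sin(tilt),  np.cos(tilt)]]).T             # tilted PTV

angles = np.arange(0.0, 180.0, 1.0)
gaps, areas = zip(*(jaw_metrics(ptv, a) for a in angles))
print("CAX angle (min x-jaw gap):", angles[int(np.argmin(gaps))], "deg")
print("CAA angle (min jaw area): ", angles[int(np.argmin(areas))], "deg")
```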
5

Empirical beam angle optimization for lung cancer intensity modulated radiation therapy

Unknown Date (has links)
Empirical methods of beam angle optimization (BAO) are tested against the BAO currently employed in the Eclipse treatment planning software. An improved BAO can decrease the time a dosimetrist spends making a treatment plan, improve treatment quality, and enhance the tools an inexperienced dosimetrist can use to develop planning techniques. Using empirical data created by experienced dosimetrists for 69 patients treated for lung cancer, the most frequently used gantry angles were tallied for four different regions in each lung to gather an optimal set of fields that could be used to treat future lung cancer patients. This method, given the moniker FAU BAO, was compared against 7 plans created with the Eclipse BAO choosing 5 fields and 9 fields. The results show that the conformality index improved by 30% and 3% relative to the 5-field and 9-field plans, respectively. The conformation number was better by 12% than the 5-field plans and by 9% than the 9-field plans. With the FAU BAO, the organs at risk (OAR) were overall better protected, producing fewer nonstochastic effects from the radiation treatment. / Includes bibliography. / Thesis (M.S.)--Florida Atlantic University, 2014. / FAU Electronic Theses and Dissertations Collection
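The empirical angle selection can be pictured as a frequency tally over historical plans. The fragment below is a guess at the general idea, with invented plan records rather than the 69-patient data set: it counts how often each gantry angle was chosen for a given lung region and keeps the most common ones as the empirical field set.

```python
from collections import Counter

# Toy tally of historically chosen gantry angles per lung region; the plan
# records here are invented, not the study's data.
historical_plans = [
    ("right_upper", [0, 40, 180, 210, 320]),
    ("right_upper", [0, 40, 200, 210, 330]),
    ("right_upper", [20, 40, 180, 210, 320]),
]

def empirical_fields(plans, region, n_fields=5):
    counts = Counter(angle for r, angles in plans for angle in angles if r == region)
    return sorted(angle for angle, _ in counts.most_common(n_fields))

print(empirical_fields(historical_plans, "right_upper"))
```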
6

Bootstrap distribution for testing a change in the Cox proportional hazard model.

January 2000 (has links)
Lam Yuk Fai. / Thesis (M.Phil.)--Chinese University of Hong Kong, 2000. / Includes bibliographical references (leaves 41-43). / Abstracts in English and Chinese.

Contents:
Chapter 1: Basic Concepts (p.9)
  1.1 Survival data (p.9)
    1.1.1 An example (p.9)
  1.2 Some important functions (p.11)
    1.2.1 Survival function (p.12)
    1.2.2 Hazard function (p.12)
  1.3 Cox Proportional Hazards Model (p.13)
    1.3.1 A special case (p.14)
    1.3.2 An example (continued) (p.15)
  1.4 Extension of the Cox Proportional Hazards Model (p.16)
  1.5 Bootstrap (p.17)
Chapter 2: A New Method (p.19)
  2.1 Introduction (p.19)
  2.2 Definition of the test (p.20)
    2.2.1 Our test statistic (p.20)
    2.2.2 The alternative test statistic I (p.22)
    2.2.3 The alternative test statistic II (p.23)
  2.3 Variations of the test (p.24)
    2.3.1 Restricted test (p.24)
    2.3.2 Adjusting for other covariates (p.26)
  2.4 Apply with bootstrap (p.28)
  2.5 Examples (p.29)
    2.5.1 Male mice data (p.34)
    2.5.2 Stanford heart transplant data (p.34)
    2.5.3 CGD data (p.34)
Chapter 3: Large Sample Properties and Discussions (p.35)
  3.1 Large sample properties and relationship to goodness of fit test (p.35)
    3.1.1 Large sample properties of A and Ap (p.35)
    3.1.2 Large sample properties of Ac and A (p.36)
  3.2 Discussions (p.37)
7

Automated Extraction of Subdural Grid Electrodes from Post-Implant MRI Scans for Epilepsy Surgery

Pozdin, Maksym O. 13 May 2004 (has links)
The objective of the current research was to develop an automated algorithm, requiring little or no user assistance, for the extraction of Subdural Grid Electrodes (SGE) from post-implant MRI scans for epilepsy surgery. The algorithm uses knowledge of the artifacts created by Subdural Electrodes (SE) in MRI scans. The algorithm does not only extract individual electrodes; it also extracts them as SGE structures. Information about the number and type of implanted electrodes is recorded during the surgery [1]. This information is used to reduce the search space and produce better results. Currently, the extraction of SGE from post-implant MRI scans is performed manually by a technologist [1, 2, 3]. It is a time-consuming process, requiring on average a few hours depending on the number of implanted SE. In addition, the process does not preserve the geometry of the structures, since electrodes are identified individually. SGE extraction is usually complicated by nearby artifacts, making manual extraction a non-trivial task that requires good visualization of the 3D space and of the orientation of the SGE within it. Currently, most technologists use 2D slice viewers for extraction of SGE from 3D MRI scans. There is no commercial software to perform the automated extraction task; the only algorithm suggested in the literature is [4]. The goal of the proposed algorithm is to improve on the performance of the algorithm in [4]. To that end, the proposed algorithm extracts SGE not as individual electrodes alone, but by applying geometric constraints to the SGE as a whole.
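A first stage of such a pipeline might locate candidate electrode artifacts before any grid-level geometry is enforced. The sketch below is an assumption, not the thesis's algorithm: it thresholds a toy MRI volume (treating SE artifacts as signal voids), labels connected components, and keeps components of plausible artifact size; regular inter-electrode spacing constraints would then be applied to the resulting centroids.

```python
import numpy as np
from scipy import ndimage

# Hypothetical first pass: threshold + connected components + size filter.
def candidate_electrodes(volume, threshold, min_vox=5, max_vox=200):
    mask = volume < threshold                    # assume SE artifacts are dark voids
    labels, n = ndimage.label(mask)              # connected-component labeling
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    keep = [i + 1 for i, s in enumerate(sizes) if min_vox <= s <= max_vox]
    return np.array(ndimage.center_of_mass(mask, labels, keep))

# Toy volume: bright background with three small dark "artifacts".
vol = np.full((40, 40, 40), 100.0)
for z, y, x in [(10, 10, 10), (10, 10, 20), (10, 10, 30)]:
    vol[z-1:z+2, y-1:y+2, x-1:x+2] = 5.0

print(candidate_electrodes(vol, threshold=50.0))   # centroids of candidate electrodes
```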
8

Development and use of a Monte Carlo-Markov cycle tree model for coronary heart disease incidence-mortality and health service usage with explicit recognition of coronary artery revascularization procedures (CARPs)

Mannan, Haider Rashid January 2008 (has links)
[Truncated abstract] The main objective of this study was to develop and validate a demographic/epidemiologic Markov model for population modelling/forecasting of CARPs as well as CHD deaths and incidence in Western Australia, using population, linked hospital morbidity, and mortality data for WA over the period 1980 to 2000. A key feature of the model was the ability to count events as individuals moved from one state to another, and an important aspect of model development and implementation was the method for estimating model transition probabilities from available population data. The model was validated through comparison of model predictions with actual event numbers and through demonstration of its use in producing forecasts under standard extrapolation methods for transition probabilities, as well as in improving the forecasts by taking into account various possible changes to the management of CHD via surgical treatment changes. The final major objective was to demonstrate the use of the model for performing sensitivity analysis of some scenarios: in particular, to explore the possible impact on future numbers of CARPs of improvements in surgical procedures, particularly the introduction of drug eluting stents (DES), and to explore the possible impact of a change in the trend of CHD incidence as might be caused by the obesity epidemic. ... When the effectiveness of PCI due to the introduction of DES was increased by reducing Pr(CABG given PCI) and Pr(a repeat PCI), there was a small decline in the requirements for PCIs, and the effect seemed to have a lag. Finally, when, in addition to these changes, other changes were incorporated capturing that PCI was used more than CABG owing to a change in health policy after the introduction of DES, there was a small increase in the requirements for PCIs, with a lag in the effect. Four incidence scenarios were developed for assessing the effect of a change in secular trends of CHD incidence, as might be caused by the obesity epidemic, such that they gradually represented an increasing effect of the obesity epidemic (assuming that other risk factors changed favourably) on CHD incidence. The strategy adopted for developing the scenarios was that, based on past trends, the most dominant component of CHD incidence was first gradually altered, and finally the remaining components were altered. The results showed that if the most dominant component of CHD incidence, e.g. Pr(CHD given no history of CHD), levelled off and the trends in all other transition probabilities continued into the future, then the projected numbers of CABGs and PCIs for 2001-2005 were insensitive to these changes. Even increasing this probability by as much as 20 percent did not alter the results much. These results implied that the short-term effect on projected numbers of CARPs caused by an increase in the most dominant component of CHD incidence, possibly due to an 'obesity epidemic', is small. In the final incidence scenario, two of the remaining CHD incidence components, Pr(CABG given no history of CHD) and Pr(CHD death given no CHD and no history of CHD), were projected to level off over 2001-2005 because these probabilities were declining over the baseline period of 1998-2000. The projected numbers of CABGs were more sensitive (compared to the previous scenarios) to these changes, but the PCIs were not.
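To make the event-counting idea concrete, here is a toy Markov cohort sketch with invented transition probabilities (they are not the thesis's Western Australian estimates): each year the cohort is redistributed among simplified CHD states, and the flows between states are counted as incident CHD cases, CARPs, and CHD deaths before the state vector is updated.

```python
import numpy as np

# Toy discrete-time Markov cohort model with event counting; states and
# one-year transition probabilities below are illustrative only.
states = ["No CHD", "CHD no CARP", "Post-CARP", "CHD death"]
P = np.array([
    [0.990, 0.007, 0.001, 0.002],   # from No CHD
    [0.000, 0.930, 0.040, 0.030],   # from CHD, no prior CARP
    [0.000, 0.000, 0.980, 0.020],   # from Post-CARP
    [0.000, 0.000, 0.000, 1.000],   # CHD death is absorbing
])

cohort = np.array([100_000.0, 5_000.0, 1_000.0, 0.0])
for year in range(2001, 2006):
    flows = cohort[:, None] * P                 # expected i -> j transition counts
    incident_chd = flows[0, 1] + flows[0, 2]
    carps = flows[0, 2] + flows[1, 2]
    chd_deaths = flows[:3, 3].sum()
    cohort = flows.sum(axis=0)                  # update state occupancy
    print(f"{year}: incident CHD={incident_chd:.0f}, "
          f"CARPs={carps:.0f}, CHD deaths={chd_deaths:.0f}")
```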
