In the field of radiation protection, complex, computationally expensive algorithms are used to predict the radiation doses to organs in the human body arising from exposure to internally deposited radionuclides. These algorithms contain many inputs, the true values of which are uncertain. Current methods for assessing the effect of the input uncertainties on the output of the algorithms are based on Monte Carlo analysis, i.e. sampling from subjective prior distributions that represent the uncertainty on each input, evaluating the output of the model and calculating sample statistics. For complex, computationally expensive algorithms it is often not possible to obtain a sample large enough for a meaningful uncertainty analysis.

This thesis presents an alternative general theory for uncertainty analysis, based on the use of stochastic process models in a Bayesian context. It provides the measures obtained from a Monte Carlo analysis, together with additional, more informative measures, from a far smaller sample. The theory is developed first in a general form and then specifically for algorithms whose input uncertainties can be characterised by independent normal distributions.

The Monte Carlo and Bayesian methodologies are then compared using two practical examples. The first is based on a simple model developed to calculate doses due to radioactive iodine. This model has two normally distributed uncertain parameters and, owing to its simplicity, an independent measure of the true uncertainty on the output is available for comparison. This exercise appears to show that the Bayesian methodology is superior in this simple case. The purpose of the second example is to determine whether the methodology is practical in a 'real-life' situation and to compare it with a Monte Carlo analysis. A model for calculating doses due to plutonium contamination is used; this model is computationally expensive and has fourteen uncertain inputs. The Bayesian analysis compared favourably with the Monte Carlo analysis, indicating that it has the potential to provide more accurate uncertainty analyses for the parameters of computationally expensive algorithms.
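As a rough illustration of the Monte Carlo approach described above, the sketch below propagates two normally distributed inputs through a toy stand-in for a dose model. The function `dose_model`, the prior means and standard deviations, and the sample size are illustrative assumptions, not taken from the thesis; the point is that the expensive model must be run once per sample.

```python
import numpy as np

rng = np.random.default_rng(0)

def dose_model(x):
    """Toy stand-in for an expensive dose-calculation code
    (the real algorithms may take minutes to hours per run)."""
    return np.exp(0.3 * x[0]) + x[1] ** 2

# Subjective normal priors on the two uncertain inputs (assumed values).
mu = np.array([1.0, 0.5])
sigma = np.array([0.2, 0.1])

# Monte Carlo: sample the inputs, run the model once per sample,
# then summarise the resulting output distribution.
n = 10_000  # impractical when each model run is expensive
samples = rng.normal(mu, sigma, size=(n, 2))
outputs = np.array([dose_model(x) for x in samples])

print(f"mean dose estimate: {outputs.mean():.4f}")
print(f"output std dev:     {outputs.std(ddof=1):.4f}")
```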
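And a minimal sketch of the emulator idea underlying the Bayesian approach: fit a stochastic process model (here a Gaussian process via scikit-learn, used as a stand-in for the thesis's formulation) to a small number of runs of the expensive model, then interrogate the cheap emulator instead. The thesis derives its uncertainty measures analytically within the Bayesian framework; propagating prior samples through the fitted emulator, as done here, is a simplification for illustration only.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(1)

def dose_model(x):
    """Same toy stand-in for the expensive model as above."""
    return np.exp(0.3 * x[0]) + x[1] ** 2

mu, sigma = np.array([1.0, 0.5]), np.array([0.2, 0.1])

# Small design: only 20 runs of the expensive model are needed.
X_train = rng.normal(mu, sigma, size=(20, 2))
y_train = np.array([dose_model(x) for x in X_train])

gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(),
                              normalize_y=True)
gp.fit(X_train, y_train)

# Propagate the input uncertainty through the cheap emulator.
X_prior = rng.normal(mu, sigma, size=(10_000, 2))
y_mean, y_std = gp.predict(X_prior, return_std=True)

print(f"emulated mean dose: {y_mean.mean():.4f}")
print(f"emulated output sd: {y_mean.std(ddof=1):.4f}")
# y_std additionally quantifies the emulator's own uncertainty about
# the model output, an extra measure a plain Monte Carlo analysis
# of the same size cannot provide.
```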
Identifier | oai:union.ndltd.org:bl.uk/oai:ethos.bl.uk:338522 |
Date | January 1997 |
Creators | Haylock, Richard George Edward |
Publisher | University of Nottingham |
Source Sets | Ethos UK |
Detected Language | English |
Type | Electronic Thesis or Dissertation |
Source | http://eprints.nottingham.ac.uk/13193/ |