Cutting-edge research problems require the use of complicated and computationally expensive computer models. I will present a practical overview of the design and analysis of computer experiments in high energy nuclear physics and astrophysics. The aim of these experiments is to infer credible ranges for certain fundamental parameters of the underlying physical processes through the analysis of model output and experimental data.

To be truly useful, computer models must be calibrated against experimental data. Gaining an understanding of the response of expensive models across the full range of inputs can be a slow and painful process. Gaussian Process emulators can be an efficient and informative surrogate for expensive computer models, and they prove to be an ideal mechanism for exploring the response of these models to variations in their inputs.

A sensitivity analysis can be performed on these model emulators to characterize and quantify the relationship between model input parameters and predicted observable properties. The result of this analysis tells the user which parameters are most important and most likely to affect the prediction of a given observable. Sensitivity analysis thus allows us to identify which model parameters can be most efficiently constrained by a given observational data set.

In this thesis I describe a range of techniques for the calibration and exploration of the complex and expensive computer models so common in modern physics research. These statistical methods are illustrated with examples drawn from the fields of high energy nuclear physics and galaxy formation.
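As a concrete sketch of the emulation strategy the abstract describes, the snippet below fits a Gaussian Process emulator to a small design of model runs using scikit-learn's `GaussianProcessRegressor`. The toy `expensive_model`, the 30-point random design, and the kernel settings are illustrative assumptions, not details from the thesis.

```python
# Minimal sketch of a Gaussian Process emulator for an expensive model.
# `expensive_model` is a cheap analytic stand-in for a real physics code.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def expensive_model(x):
    # Placeholder for a costly simulation run.
    return np.sin(3.0 * x[:, 0]) + 0.5 * x[:, 1] ** 2

rng = np.random.default_rng(0)
X_train = rng.uniform(0.0, 1.0, size=(30, 2))   # design points in input space
y_train = expensive_model(X_train)              # run the model at the design

kernel = ConstantKernel(1.0) * RBF(length_scale=[0.2, 0.2])
emulator = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
emulator.fit(X_train, y_train)

# The emulator now predicts the model response, with uncertainty,
# anywhere in the input space at negligible cost.
X_new = rng.uniform(0.0, 1.0, size=(5, 2))
mean, std = emulator.predict(X_new, return_std=True)
```

The emulator's predictive standard deviation is what makes it informative as well as fast: it flags regions of input space where the surrogate is untrustworthy and more model runs are needed.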
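Calibration against experimental data, mentioned above, can then be carried out with the emulator standing in for the full model. Below is a minimal sketch of grid-based Bayesian calibration of a single parameter under a Gaussian likelihood and a uniform prior; the stand-in emulator mean, the measured value `y_obs`, and its error `sigma` are hypothetical, not numbers from the thesis.

```python
# Minimal sketch of calibrating one model parameter against a single
# measured observable, assuming a Gaussian likelihood and uniform prior.
import numpy as np

def emulator_mean(theta):
    # Stand-in for the GP emulator's posterior-mean prediction.
    return np.sin(3.0 * theta)

y_obs, sigma = 0.7, 0.1                       # hypothetical measurement and error
theta_grid = np.linspace(0.0, 1.0, 1001)      # uniform prior over [0, 1]
dx = theta_grid[1] - theta_grid[0]

log_like = -0.5 * ((y_obs - emulator_mean(theta_grid)) / sigma) ** 2
post = np.exp(log_like - log_like.max())
post /= post.sum() * dx                       # normalize to a posterior density

# A 95% credible interval for the calibrated parameter.
cdf = np.cumsum(post) * dx
lo, hi = np.interp([0.025, 0.975], cdf, theta_grid)
```

In realistic settings the parameter space is multidimensional and the grid is replaced by MCMC sampling, with the emulator's own predictive variance added to the experimental error in the likelihood.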
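The variance-based sensitivity analysis described above can likewise be run on the cheap emulator rather than the full model. The sketch below estimates first-order Sobol indices with a pick-and-freeze Monte Carlo estimator in the style of Saltelli et al. (2010); the helper `first_order_sobol` and the toy response are illustrative, with the emulator's posterior mean taking the place of `f` in practice.

```python
# Minimal sketch of variance-based sensitivity analysis: Monte Carlo
# estimation of first-order Sobol indices over the unit hypercube.
import numpy as np

def first_order_sobol(f, dim, n_samples=4096, seed=0):
    """Estimate the first-order Sobol index of each input of f."""
    rng = np.random.default_rng(seed)
    A = rng.uniform(size=(n_samples, dim))
    B = rng.uniform(size=(n_samples, dim))
    fA, fB = f(A), f(B)
    total_var = np.var(np.concatenate([fA, fB]))
    indices = np.empty(dim)
    for i in range(dim):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                   # freeze input i at B's values
        indices[i] = np.mean(fB * (f(ABi) - fA)) / total_var
    return indices

# Toy response: input 0 drives most of the variance, input 1 very little.
f = lambda X: np.sin(3.0 * X[:, 0]) + 0.1 * X[:, 1]
print(first_order_sobol(f, dim=2))            # ~[0.99, 0.01] up to MC noise
```

A first-order index near one marks a parameter the observable can constrain tightly; an index near zero marks one the data set will barely inform.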
Identifier | oai:union.ndltd.org:DUKE/oai:dukespace.lib.duke.edu:10161/8782
Date | January 2014 |
Creators | Coleman-Smith, Christopher |
Contributors | Müller, Berndt |
Source Sets | Duke University |
Detected Language | English |
Type | Dissertation |