Bayesian parameter estimation is a popular method for addressing inverse problems. However, since prior distributions are chosen based on expert judgement, the method can inherently introduce bias into the understanding of the parameters. This is especially relevant in the case of distributed parameters, where it is difficult to check for error. To minimize this bias, we develop the idea of a minimally corrective, approximately recovering prior (MCAR prior), which generates a guide for the prior and corrects the expert-supplied prior according to that guide. We demonstrate this approach for the one-dimensional elliptic partial differential equation and observe how the method behaves both with significant expert bias and without any expert bias. In the case of significant expert bias, the method substantially reduces the bias; in the case with no expert bias, it introduces only minor errors. The small cost incurred when the expert's judgement is good is outweighed by the benefit of correcting major errors when the judgement is poor. This is particularly true when the prior is determined only from a heuristic or an assumed distribution. / Master of Science
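The abstract does not spell out the MCAR construction itself, but the general workflow it describes (build a data-derived guide, correct a possibly biased expert prior toward that guide, then compute the Bayesian posterior) can be sketched for a one-dimensional elliptic problem. The sketch below is illustrative only: it assumes a 1D Poisson equation with homogeneous Dirichlet boundary conditions, takes the distributed source term as the unknown so that the forward map is linear, uses a smoothness-regularized least-squares estimate as the guide, and applies a simple convex-combination correction. None of these specific choices are taken from the thesis.

```python
# Hypothetical sketch: correcting a biased expert prior with a data-derived
# "guide" before Bayesian estimation of a distributed parameter in a 1D
# elliptic problem.  The guide and the convex-combination correction used
# here are illustrative stand-ins, not the thesis's MCAR construction.
import numpy as np

n = 50                                    # interior grid points on (0, 1)
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)

# Forward problem: -u'' = f, u(0) = u(1) = 0, discretized by finite differences.
# The map f -> u is linear: u = A^{-1} f with A the tridiagonal Laplacian.
A = (np.diag(2.0 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2

# Sparse, noisy observations of u, so the prior genuinely matters.
obs_idx = np.arange(2, n, 6)
G = np.linalg.inv(A)[obs_idx, :]          # observation operator (u at a few points)

rng = np.random.default_rng(0)
f_true = np.sin(np.pi * x) + 0.5 * np.sin(3 * np.pi * x)
sigma = 1e-3
d = G @ f_true + sigma * rng.standard_normal(len(obs_idx))

# Expert prior: Gaussian with a deliberately (badly) biased mean.
m_expert = np.full(n, 5.0)
C_prior = 0.25 * np.exp(-np.abs(x[:, None] - x[None, :]) / 0.2)

# Guide: a smoothness-regularized least-squares estimate built from the data.
L = h**2 * A                              # dimensionless second-difference operator
beta = 1e-3
guide = np.linalg.solve(G.T @ G + beta * (L.T @ L), G.T @ d)

# Minimal correction: pull the expert mean toward the guide
# (theta = 0 keeps the expert prior, theta = 1 replaces it by the guide).
theta = 0.7
m_corr = (1.0 - theta) * m_expert + theta * guide

# Linear-Gaussian posterior means under the original and corrected priors.
Cinv = np.linalg.inv(C_prior)
H = G.T @ G / sigma**2 + Cinv
mean_expert = np.linalg.solve(H, G.T @ d / sigma**2 + Cinv @ m_expert)
mean_corr = np.linalg.solve(H, G.T @ d / sigma**2 + Cinv @ m_corr)

print("posterior error, expert prior   :", np.linalg.norm(mean_expert - f_true))
print("posterior error, corrected prior:", np.linalg.norm(mean_corr - f_true))
```

Because the data are sparse, the posterior mean follows the prior mean in unobserved directions; correcting a strongly biased expert mean toward the guide therefore reduces the reconstruction error, while a small theta leaves a trustworthy expert prior largely intact.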
Identifier | oai:union.ndltd.org:VTETD/oai:vtechworks.lib.vt.edu:10919/54593 |
Date | 23 July 2015 |
Creators | May, Thomas Joseph |
Contributors | Mathematics, Zietsman, Lizette, Borggaard, Jeffrey T., Rossi, John F. |
Publisher | Virginia Tech |
Source Sets | Virginia Tech Theses and Dissertations
Detected Language | English |
Type | Thesis |
Format | ETD, application/pdf |
Rights | In Copyright, http://rightsstatements.org/vocab/InC/1.0/ |