Author: May, Thomas Joseph
Date: 2015-07-24 (accessioned); 2015-07-24 (available); 2015-07-23 (issued)
Identifier: vt_gsexam:5180
URI: http://hdl.handle.net/10919/54593
Abstract: Bayesian parameter estimation is a popular method for addressing inverse problems. However, since prior distributions are chosen based on expert judgement, the method can inherently introduce bias into the understanding of the parameters. This is especially relevant for distributed parameters, where it is difficult to check the prior for error. To minimize this bias, we develop the idea of a minimally corrective, approximately recovering prior (MCAR prior), which generates a guide for the prior and corrects the expert-supplied prior according to that guide. We demonstrate this approach on a one-dimensional elliptic partial differential equation and observe how the method performs both with significant expert bias and with none. With significant expert bias, the method substantially reduces the bias; with no expert bias, it introduces only minor errors. The small errors introduced when the expert judgement is good are a worthwhile price for correcting major errors when the judgement is bad, particularly when the prior is determined only by a heuristic or an assumed distribution.
Description: ETD
Rights: In Copyright
Subjects: Bayesian Parameter Estimation; Minimally Corrective Priors; Distributed Parameters; Elliptic Equation; Karhunen-Loeve Theorem
Title: Minimally Corrective, Approximately Recovering Priors to Correct Expert Judgement in Bayesian Parameter Estimation
Type: Thesis
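
As a rough illustration of the setting described in the abstract, the sketch below (not taken from the thesis) estimates a constant log-diffusion coefficient in a one-dimensional elliptic equation from noisy observations and contrasts a deliberately biased expert prior with one re-centered on a simple data-informed guide. All names, discretization choices, priors, and noise levels are illustrative assumptions; the sketch does not implement the MCAR construction or a distributed (Karhunen-Loeve) parameterization.

import numpy as np

def solve_elliptic(log_k, n=100):
    # Finite-difference solve of -(k u')' = 1 on (0, 1) with u(0) = u(1) = 0,
    # for a constant conductivity k = exp(log_k).
    h = 1.0 / n
    k = np.exp(log_k)
    main = 2.0 * k / h**2 * np.ones(n - 1)
    off = -k / h**2 * np.ones(n - 2)
    A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
    u_interior = np.linalg.solve(A, np.ones(n - 1))
    return np.concatenate(([0.0], u_interior, [0.0]))

# Synthetic observations of u at a few interior grid points.
rng = np.random.default_rng(0)
true_log_k = np.log(2.0)
obs_idx = np.array([20, 50, 80])
sigma = 3e-3
data = solve_elliptic(true_log_k)[obs_idx] + sigma * rng.normal(size=obs_idx.size)

def log_posterior(log_k, prior_mean, prior_std):
    # Gaussian likelihood on the observations plus a Gaussian prior on log k.
    resid = solve_elliptic(log_k)[obs_idx] - data
    return -0.5 * np.sum(resid**2) / sigma**2 - 0.5 * ((log_k - prior_mean) / prior_std) ** 2

grid = np.linspace(-1.0, 2.0, 400)

# MAP estimate under a deliberately biased "expert" prior.
biased_map = grid[np.argmax([log_posterior(t, prior_mean=-0.5, prior_std=0.1) for t in grid])]

# A crude data-informed guide: the least-squares fit to the observations.
guide = grid[np.argmin([np.sum((solve_elliptic(t)[obs_idx] - data) ** 2) for t in grid])]

# MAP estimate after re-centering (loosely "correcting") the prior on the guide.
corrected_map = grid[np.argmax([log_posterior(t, prior_mean=guide, prior_std=0.3) for t in grid])]

print(f"true log k = {true_log_k:.3f}")
print(f"MAP with biased prior    = {biased_map:.3f}")
print(f"MAP with corrected prior = {corrected_map:.3f}")

The point of the contrast is only that re-centering a badly placed prior on even a crude data-driven guide can remove much of the induced bias, while changing little when the original prior was already reasonable, which mirrors the trade-off the abstract describes.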