Author: Wycoff, Nathan Benjamin
Date accessioned: 2021-08-21
Date available: 2021-08-21
Date issued: 2021-08-20
Identifier: vt_gsexam:31835
URI: http://hdl.handle.net/10919/104683
Title: Gradient-Based Sensitivity Analysis with Kernels
Type: Dissertation (ETD)
Rights: In Copyright
Subjects: Sensitivity Analysis; Nonparametric Regression; Dimension Reduction; Derivative Free Optimization

Abstract:
Emulation of computer experiments via surrogate models can be difficult when the number of input parameters determining the simulation grows beyond a few dozen. In this dissertation, we explore dimension reduction in the context of computer experiments. The active subspace method is a linear dimension reduction technique which uses the gradients of a function to determine important input directions. Unfortunately, we cannot always expect to have access to the gradients of our black-box functions. We thus begin by developing an estimator for the active subspace of a function, using kernel methods to estimate the gradient indirectly. We then demonstrate how to deploy the learned input directions to improve the predictive performance of local regression models by "undoing" the active subspace. Finally, we develop notions of sensitivity that are local to certain parts of the input space, which we then use to build a Bayesian optimization algorithm that can exploit locally important directions.
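As a minimal sketch of the active subspace idea the abstract refers to (not the dissertation's kernel-based, gradient-free estimator), the subspace can be estimated by Monte Carlo when gradients are available: average the gradient outer products and take the leading eigenvectors. The toy function, its weight vector w, and the sample sizes below are purely illustrative assumptions.

```python
import numpy as np

# Sketch: estimate C = E[grad f(x) grad f(x)^T] by Monte Carlo, then take
# the leading eigenvectors of C as the important input directions.
# Assumes gradients are available; the dissertation instead estimates them
# indirectly with kernel methods.

def grad_f(x):
    # Gradient of the toy function f(x) = sin(w @ x), which varies only
    # along the single (hypothetical) direction w.
    w = np.linspace(1.0, 0.0, x.size)
    return np.cos(w @ x) * w

rng = np.random.default_rng(0)
d, n = 10, 500
X = rng.uniform(-1.0, 1.0, size=(n, d))  # sample the input space

# Monte Carlo estimate of the gradient outer-product matrix.
C = np.zeros((d, d))
for x in X:
    g = grad_f(x)
    C += np.outer(g, g)
C /= n

# Eigendecomposition: large eigenvalues flag active directions.
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]
print("leading eigenvalue fraction:", eigvals[order[0]] / eigvals.sum())
print("estimated active direction:", eigvecs[:, order[0]])
```

For this toy function the leading eigenvalue dominates and its eigenvector recovers w (up to sign and scale), illustrating how the spectrum of C reveals a low-dimensional set of important input directions.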