
Gradient-Based Sensitivity Analysis with Kernels

dc.contributor.author: Wycoff, Nathan Benjamin
dc.contributor.committeechair: Gramacy, Robert B.
dc.contributor.committeemember: Wild, Stefan M.
dc.contributor.committeemember: Binois, Mickael
dc.contributor.committeemember: Leman, Scotland C.
dc.contributor.committeemember: Higdon, David
dc.contributor.department: Statistics
dc.date.accessioned: 2021-08-21T08:00:07Z
dc.date.available: 2021-08-21T08:00:07Z
dc.date.issued: 2021-08-20
dc.description.abstract: Emulation of computer experiments via surrogate models can be difficult when the number of input parameters determining the simulation grows beyond a few dozen. In this dissertation, we explore dimension reduction in the context of computer experiments. The active subspace method is a linear dimension-reduction technique that uses the gradients of a function to determine important input directions. Unfortunately, we cannot always expect to have access to the gradients of our black-box functions. We thus begin by developing an estimator for the active subspace of a function, using kernel methods to indirectly estimate the gradient. We then demonstrate how to deploy the learned input directions to improve the predictive performance of local regression models by "undoing" the active subspace. Finally, we develop notions of sensitivity that are local to certain parts of the input space, which we use to develop a Bayesian optimization algorithm that can exploit locally important directions.
dc.description.abstractgeneral: Increasingly, scientists and engineers developing new understanding or products rely on computers to simulate complex phenomena. Sometimes these computer programs are so detailed that the amount of time they take to run becomes a serious issue. Surrogate modeling is the problem of trying to predict a computer experiment's result without actually running it, on the basis of having observed the behavior of similar simulations. Typically, computer experiments have different settings that induce different behavior. When there are many settings to tweak, typical surrogate modeling approaches can struggle. In this dissertation, we develop a technique for deciding which input settings, or even which combinations of input settings, we should focus our attention on when trying to predict the output of the computer experiment. We then deploy this technique both to predicting computer experiment outputs and to finding which input settings yield a particular desired result.
dc.description.degree: Doctor of Philosophy
dc.format.medium: ETD
dc.identifier.other: vt_gsexam:31835
dc.identifier.uri: http://hdl.handle.net/10919/104683
dc.publisher: Virginia Tech
dc.rights: In Copyright
dc.rights.uri: http://rightsstatements.org/vocab/InC/1.0/
dc.subject: Sensitivity Analysis
dc.subject: Nonparametric Regression
dc.subject: Dimension Reduction
dc.subject: Derivative Free Optimization
dc.title: Gradient-Based Sensitivity Analysis with Kernels
dc.type: Dissertation
thesis.degree.discipline: Statistics
thesis.degree.grantor: Virginia Polytechnic Institute and State University
thesis.degree.level: doctoral
thesis.degree.name: Doctor of Philosophy
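
The active subspace method named in the abstract can be sketched in a few lines. This is a minimal illustration of the general idea (Monte Carlo estimation of C = E[∇f ∇fᵀ] followed by an eigendecomposition), not the dissertation's kernel-based estimator; the function names and test problem below are illustrative assumptions.

```python
import numpy as np

def active_subspace(grad_f, dim, n_samples=500, k=1, seed=0):
    """Estimate a k-dimensional active subspace of f from its gradient.

    Monte Carlo approximation of C = E[grad f grad f^T] over the
    hypercube [-1, 1]^dim, followed by an eigendecomposition of C.
    Returns the eigenvalues (descending) and the leading k eigenvectors.
    """
    rng = np.random.default_rng(seed)
    X = rng.uniform(-1.0, 1.0, size=(n_samples, dim))
    G = np.array([grad_f(x) for x in X])      # (n_samples, dim) gradient samples
    C = G.T @ G / n_samples                   # sample outer-product matrix
    eigvals, eigvecs = np.linalg.eigh(C)      # eigh returns ascending order
    order = np.argsort(eigvals)[::-1]
    return eigvals[order], eigvecs[:, order[:k]]

# Illustrative test problem: f(x) = sin(w . x) varies only along w,
# so it has a one-dimensional active subspace spanned by w.
w = np.array([1.0, 2.0, 0.0, 0.0, 0.0])
grad = lambda x: np.cos(w @ x) * w
vals, W = active_subspace(grad, dim=5, k=1)
# The leading eigenvector aligns with w (up to sign), and the
# remaining eigenvalues are near zero.
```

When gradients are unavailable, as the abstract notes, they must be estimated indirectly (e.g., from a kernel surrogate fit to function evaluations) before forming C.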

Files

Original bundle
Name: Wycoff_NB_D_2021.pdf
Size: 2.58 MB
Format: Adobe Portable Document Format