Author: Krahnke, Andreas
Dates: 2014-03-14; 2001-04-30
Identifier: etd-05032001-183707
URI: http://hdl.handle.net/10919/32132
Abstract: In this work we present a new tool for the convergence analysis of numerical optimization methods. It is based on the concepts of the Clarke derivative and set-valued mappings. Our goal is to apply this tool to minimization problems with non-smooth and noisy objective functions. After deriving a necessary condition for minimizers of such functions, we examine two unconstrained optimization routines. First, we prove new convergence theorems for Implicit Filtering and General Pattern Search. Then we show how these results can be used in practice by carrying out numerical computations.
Language: en
Rights: In Copyright
Keywords: Implicit Filtering; Noisy Objective Function; Pattern Search; Non-smooth Optimization; Set-Valued Mapping; Clarke Derivative
Title: The Clarke Derivative and Set-Valued Mappings in the Numerical Optimization of Non-Smooth, Noisy Functions
Type: Thesis
Source: http://scholar.lib.vt.edu/theses/available/etd-05032001-183707/
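
For context, the Clarke generalized directional derivative that the abstract refers to is, for a locally Lipschitz function f at a point x in a direction v, standardly defined as

$$
f^{\circ}(x; v) = \limsup_{y \to x,\; t \downarrow 0} \frac{f(y + t v) - f(y)}{t},
$$

and a point x* at which f°(x*; v) ≥ 0 for every direction v is Clarke stationary; the necessary condition mentioned in the abstract is of this type, though the thesis may work with a variant adapted to noisy objective functions.

The sketch below is a minimal, illustrative coordinate-direction pattern search applied to a non-smooth objective with small bounded noise. The names and parameters (pattern_search, noisy_abs, delta, shrink) are placeholders of this sketch, not the algorithm or notation analyzed in the thesis.

```python
import numpy as np

def pattern_search(f, x0, delta=1.0, delta_min=1e-6, shrink=0.5, max_iter=1000):
    """Minimal coordinate-direction pattern search sketch.

    Polls the 2n points x +/- delta * e_i; accepts the first improving
    point, otherwise contracts the step size. Illustrative only, not the
    Generalized Pattern Search formulation studied in the thesis.
    """
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    n = x.size
    for _ in range(max_iter):
        improved = False
        for i in range(n):
            for sign in (+1.0, -1.0):
                trial = x.copy()
                trial[i] += sign * delta
                ft = f(trial)
                if ft < fx:          # accept the first improving poll point
                    x, fx = trial, ft
                    improved = True
                    break
            if improved:
                break
        if not improved:
            delta *= shrink          # unsuccessful poll: contract the step
            if delta < delta_min:
                break
    return x, fx

if __name__ == "__main__":
    rng = np.random.default_rng(0)

    def noisy_abs(x):
        # non-smooth objective |x1| + |x2| plus small bounded noise
        return np.abs(x).sum() + 1e-3 * rng.uniform(-1.0, 1.0)

    x_best, f_best = pattern_search(noisy_abs, x0=[2.0, -3.0])
    print(x_best, f_best)
```

For comparison, Implicit Filtering typically combines difference-gradient approximations on a shrinking stencil with a line search rather than relying on polling alone; the abstract's convergence theorems cover both classes of methods using the same Clarke-derivative and set-valued-mapping framework.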