On the Convergence of a Class of Derivative-free Minimization Algorithms

TR Number
CS75005-R
Date
1975
Publisher
Department of Computer Science, Virginia Polytechnic Institute & State University
Abstract

A convergence analysis is presented for a general class of derivative-free algorithms for minimizing a function f(x) for which analytic forms of the gradient and the Hessian are impractical to obtain. The class of algorithms accepts finite-difference approximations to the gradient, with step sizes chosen according to the following rule: if x1, x2 are two successive iterate points and h1, h2 are the corresponding step sizes, then two conditions are required. The algorithms also maintain an approximation to the second-derivative matrix and require that the change in x made at each iteration be subject to a bound that is also revised automatically. The convergence theorems have the features that the starting point x1 need not be close to the true solution and f(x) need not be convex. Furthermore, despite the fact that the second-derivative approximation may not converge to the true Hessian at the solution, the rate of convergence is still Q-superlinear. The theory is also shown to be applicable to a modification of Powell's dog-leg algorithm.
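
The following Python sketch illustrates, under stated assumptions, the kind of iteration the abstract describes: a finite-difference approximation to the gradient whose step size is tied to the progress of the iterates, a maintained approximation to the second-derivative matrix, and a bound on the change in x that is revised automatically. It is not the report's algorithm; the names (fd_grad, df_minimize, delta, rho) and the particular update rules (a BFGS-style matrix update and a simple trust-region-style acceptance test) are illustrative choices.

import numpy as np

def fd_grad(f, x, h):
    """Forward-difference approximation to the gradient of f at x with step size h."""
    n = x.size
    g = np.zeros(n)
    fx = f(x)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def df_minimize(f, x0, delta0=1.0, h0=1e-2, tol=1e-8, max_iter=200):
    x = np.asarray(x0, dtype=float)
    n = x.size
    B = np.eye(n)            # approximation to the second-derivative matrix
    delta, h = delta0, h0    # bound on the change in x, and difference step size
    for _ in range(max_iter):
        g = fd_grad(f, x, h)
        if np.linalg.norm(g) < tol:
            break
        # Quasi-Newton step, clipped so the change in x obeys the current bound.
        s = -np.linalg.solve(B, g)
        if np.linalg.norm(s) > delta:
            s *= delta / np.linalg.norm(s)
        pred = -(g @ s + 0.5 * s @ B @ s)    # reduction predicted by the model
        ared = f(x) - f(x + s)               # reduction actually observed
        rho = ared / pred if pred > 0 else -1.0
        if rho > 0.1:                        # accept the step
            x_new = x + s
            # Update B from the change in the finite-difference gradient (BFGS-style).
            y = fd_grad(f, x_new, h) - g
            sy = s @ y
            if sy > 1e-12:
                Bs = B @ s
                B += np.outer(y, y) / sy - np.outer(Bs, Bs) / (s @ Bs)
            x = x_new
            # Tie the difference step size to the size of the step just taken.
            h = min(h, max(np.linalg.norm(s), tol))
        # Revise the bound on the change in x automatically.
        delta = 2.0 * delta if rho > 0.75 else (0.5 * delta if rho < 0.25 else delta)
    return x

# Example usage: minimize the Rosenbrock function without analytic derivatives.
if __name__ == "__main__":
    rosen = lambda z: (1 - z[0])**2 + 100 * (z[1] - z[0]**2)**2
    print(df_minimize(rosen, np.array([-1.2, 1.0])))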
