Methods of non-linear least squares estimation
This thesis is a summary of some of the published techniques for determining the least squares estimates of the parameters in non-linear statistical models. It is first briefly indicated why non-linear models require iterative numerical estimation procedures, unlike the straightforward algebraic estimation possible in the linear case. The following six general numerical approaches are then discussed, with the emphasis as much on the application of the algorithms as on their theoretical bases:
- linearization techniques - use of a linear approximation to the model (also called Gauss-Newton or Taylor Series methods),
- gradient techniques - use of the negative gradient of the sum of squares function (includes the well-known Method of Steepest Descent),
- combination techniques - attempts to gain an improved rate of convergence through a combination of the above two procedures,
- methods of parallel tangents - a geometrical approach,
- methods of conjugate directions - another geometrical approach,
- a simplex method - based upon successive sets of guesses.
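To illustrate the first of these approaches, a minimal Gauss-Newton sketch in Python follows; the exponential model y = a·exp(bx), the synthetic data, and the starting values are illustrative choices, not taken from the thesis:

```python
import math

def gauss_newton(xs, ys, a, b, iters=30):
    """Fit y = a*exp(b*x): linearize the model about the current (a, b),
    solve the resulting linear least squares problem, and repeat."""
    for _ in range(iters):
        r = [y - a * math.exp(b * x) for x, y in zip(xs, ys)]  # residuals
        Ja = [math.exp(b * x) for x in xs]                     # df/da
        Jb = [a * x * math.exp(b * x) for x in xs]             # df/db
        # normal equations (J'J) step = J'r, solved by Cramer's rule (2x2)
        m11 = sum(j * j for j in Ja)
        m12 = sum(p * q for p, q in zip(Ja, Jb))
        m22 = sum(j * j for j in Jb)
        v1 = sum(j * e for j, e in zip(Ja, r))
        v2 = sum(j * e for j, e in zip(Jb, r))
        det = m11 * m22 - m12 * m12
        a += (v1 * m22 - v2 * m12) / det
        b += (m11 * v2 - m12 * v1) / det
    return a, b

xs = [i * 0.25 for i in range(12)]
ys = [2.0 * math.exp(0.5 * x) for x in xs]   # exact data from a=2, b=0.5
a, b = gauss_newton(xs, ys, a=1.8, b=0.45)
```

Each iteration replaces the model by its first-order Taylor approximation at the current parameter values, which is what makes a linear least squares solution available at every step.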
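The gradient approach can be sketched the same way: step along the negative gradient of the sum of squares function, shortening the step until the sum of squares actually decreases. The quadratic surface used here is an illustrative stand-in, not an example from the thesis:

```python
def sum_sq(t):
    # illustrative sum-of-squares surface with minimum at (1, 2)
    return (t[0] - 1.0) ** 2 + 10.0 * (t[1] - 2.0) ** 2

def grad(t):
    return [2.0 * (t[0] - 1.0), 20.0 * (t[1] - 2.0)]

def steepest_descent(f, g, x, iters=200):
    """Move along the negative gradient, halving the step until f decreases."""
    for _ in range(iters):
        gx = g(x)
        t, fx = 1.0, f(x)
        while f([xi - t * gi for xi, gi in zip(x, gx)]) >= fx:
            t *= 0.5
            if t < 1e-12:
                return x  # no descent possible: (near-)stationary point
        x = [xi - t * gi for xi, gi in zip(x, gx)]
    return x

theta = steepest_descent(sum_sq, grad, [0.0, 0.0])
```

The characteristic weakness visible here is the zigzag path on elongated surfaces, which is the slow convergence that motivates the combination and conjugate-direction methods.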
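Marquardt's algorithm is probably the best-known combination of the linearization and gradient ideas (whether it is the specific combination treated in the thesis is not asserted here): a multiple λ of the identity is added to the normal-equations matrix, so a small λ gives a Gauss-Newton step and a large λ a short steepest-descent step. A sketch, again on an illustrative exponential model:

```python
import math

def sse(xs, ys, a, b):
    return sum((y - a * math.exp(b * x)) ** 2 for x, y in zip(xs, ys))

def marquardt(xs, ys, a, b, lam=1e-2, iters=50):
    """Fit y = a*exp(b*x): Gauss-Newton normal equations damped by lam*I.
    A successful step shrinks lam (toward Gauss-Newton); a failed step
    grows it (toward a short steepest-descent step) and is retried."""
    for _ in range(iters):
        r = [y - a * math.exp(b * x) for x, y in zip(xs, ys)]
        Ja = [math.exp(b * x) for x in xs]              # df/da
        Jb = [a * x * math.exp(b * x) for x in xs]      # df/db
        m11 = sum(j * j for j in Ja)
        m12 = sum(p * q for p, q in zip(Ja, Jb))
        m22 = sum(j * j for j in Jb)
        v1 = sum(j * e for j, e in zip(Ja, r))
        v2 = sum(j * e for j, e in zip(Jb, r))
        while True:                                     # adapt the damping
            d11, d22 = m11 + lam, m22 + lam
            det = d11 * d22 - m12 * m12
            da = (v1 * d22 - v2 * m12) / det
            db = (d11 * v2 - m12 * v1) / det
            if sse(xs, ys, a + da, b + db) < sse(xs, ys, a, b):
                a, b, lam = a + da, b + db, lam / 10.0
                break
            lam *= 10.0
            if lam > 1e12:                              # effectively converged
                return a, b
    return a, b

xs = [i * 0.25 for i in range(12)]
ys = [2.0 * math.exp(0.5 * x) for x in xs]   # exact data from a=2, b=0.5
a, b = marquardt(xs, ys, 1.0, 0.1)
```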
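The conjugate-direction idea can be illustrated with Fletcher-Reeves conjugate gradients (one member of this family; not necessarily the variant discussed in the thesis). On a quadratic sum-of-squares surface in two parameters, two mutually conjugate steps with exact line searches reach the minimum:

```python
def conjugate_gradient(x, iters=2):
    """Fletcher-Reeves conjugate gradients with exact line search on the
    quadratic surface S = (t1-1)^2 + 10*(t2-2)^2, whose Hessian is
    diag(2, 20); two conjugate steps reach the minimum of a 2-parameter
    quadratic."""
    grad = lambda p: [2.0 * (p[0] - 1.0), 20.0 * (p[1] - 2.0)]
    h = (2.0, 20.0)                      # diagonal Hessian of S
    g = grad(x)
    d = [-g[0], -g[1]]                   # first direction: steepest descent
    for _ in range(iters):
        dhd = h[0] * d[0] ** 2 + h[1] * d[1] ** 2
        t = -(g[0] * d[0] + g[1] * d[1]) / dhd   # exact minimizing step
        x = [x[0] + t * d[0], x[1] + t * d[1]]
        g_new = grad(x)
        # Fletcher-Reeves update makes the new direction conjugate to d
        beta = (g_new[0] ** 2 + g_new[1] ** 2) / (g[0] ** 2 + g[1] ** 2)
        d = [-g_new[0] + beta * d[0], -g_new[1] + beta * d[1]]
        g = g_new
    return x

theta = conjugate_gradient([0.0, 0.0])
```

For a general non-linear sum of squares the same scheme is applied with a numerical line search, losing finite termination but keeping the improved rate over steepest descent.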
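Finally, a sketch of a simplex search in the spirit of Nelder and Mead (the thesis may treat a different variant): the current set of guesses forms a simplex, and the worst vertex is repeatedly reflected, expanded, or contracted, or the whole simplex is shrunk toward the best vertex. The surface below is an illustrative choice:

```python
def nelder_mead(f, simplex, iters=200):
    """Minimize f by reflecting/expanding/contracting a simplex of guesses."""
    alpha, gamma, rho, sigma = 1.0, 2.0, 0.5, 0.5
    n = len(simplex[0])
    for _ in range(iters):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        # centroid of all vertices except the worst
        cen = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        refl = [c + alpha * (c - w) for c, w in zip(cen, worst)]
        if f(refl) < f(best):
            # try expanding further along the reflection direction
            exp_ = [c + gamma * (r - c) for c, r in zip(cen, refl)]
            simplex[-1] = exp_ if f(exp_) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):
            simplex[-1] = refl
        else:
            # contract toward the worst vertex, or shrink the whole simplex
            con = [c + rho * (w - c) for c, w in zip(cen, worst)]
            if f(con) < f(worst):
                simplex[-1] = con
            else:
                simplex = [best] + [
                    [bi + sigma * (qi - bi) for bi, qi in zip(best, q)]
                    for q in simplex[1:]
                ]
    simplex.sort(key=f)
    return simplex[0]

def sum_sq(t):
    # illustrative sum-of-squares surface with minimum at (3, -1)
    return (t[0] - 3.0) ** 2 + (t[1] + 1.0) ** 2

best = nelder_mead(sum_sq, [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
```

The method uses only function values (no derivatives), which makes it attractive when the sum of squares is awkward to differentiate.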
A large bibliography is also included as a guide for further study.