A controller design procedure for nonlinear stochastic systems

Date
1984
Publisher
Virginia Polytechnic Institute and State University
Abstract

An improved method for designing controllers for nonlinear stochastic systems is developed and analyzed. The resulting controller consists of a nonlinear control law coupled with an adaptive state estimator.

The nonlinear control law is developed first. Using Taylor series expansion, linear approximations to the nonlinear system are generated at selected points in the operating region. A control law that will produce the desired response is then developed for each linearized configuration using conventional techniques for linear systems. The resulting control law parameters are treated as tabulated values from a set of unknown continuous functions of the nonlinear system parameters. These unknown functions are approximated at all points in the operating region by fitting curves to the tabulated data. The stability and convergence properties of this nonlinear control law are analyzed in detail, with several derivations given and theorems proved. Two examples are given to illustrate the design procedure and evaluate its performance.
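
As a rough illustration of this step, the sketch below applies the same sequence to a hypothetical scalar plant (x_dot = sin(x) + u, which is not one of the dissertation's examples): linearize at selected operating points, design a pole-placement gain for each linearization, and fit a polynomial to the tabulated gains so that the control law parameter varies continuously over the operating region. The plant, the desired pole location, and the polynomial degree are all illustrative assumptions, not choices taken from the dissertation.

```python
import numpy as np

# --- Hypothetical scalar plant, used only to illustrate the procedure ---
#   x_dot = sin(x) + u        (open loop unstable near the origin)
f = np.sin
df_dx = np.cos                      # Jacobian of f, from the Taylor expansion

# 1) Generate linear approximations at selected points in the operating region.
operating_points = np.linspace(-np.pi, np.pi, 9)

# 2) For each linearized configuration, design a linear control law by a
#    conventional technique -- here pole placement putting the closed-loop
#    pole at s = -2 (an assumed "desired response").
desired_pole = -2.0
gain_table = df_dx(operating_points) - desired_pole    # k(x0) = cos(x0) + 2

# 3) Treat the tabulated gains as samples of an unknown continuous function of
#    the operating point and approximate that function by curve fitting.
gain_poly = np.polyfit(operating_points, gain_table, deg=4)

def gain(x):
    """Fitted control law parameter, usable anywhere in the operating region."""
    return np.polyval(gain_poly, x)

# 4) The resulting nonlinear control law uses the fitted gain at the current state.
def control(x, x_ref=0.0):
    return -gain(x) * (x - x_ref)

# Closed-loop check (forward Euler) from an off-design initial condition.
x, dt = 2.5, 0.01
for _ in range(500):
    x += dt * (f(x) + control(x))
print(f"state after 5 s: {x: .5f}")   # settles near the reference 0
```

The key point of the sketch is the third step: the gains are designed only at the tabulated points, but the fitted curve supplies a control law parameter between them, which is what makes the control law usable across the whole operating region.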

The design procedure is extended to stochastic systems by incorporating a suitable state estimator. Two members of the class known as partitioned adaptive estimators (PAEs) are evaluated and their performance compared. The formulation known as the modified semi-Markov PAE is shown to be superior. The design, execution, and analysis of the experiments comprising the evaluation are discussed in detail, with particular attention given to correlating the performance of the estimators with the behavior of the weighting coefficients.
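
For context on how such an estimator operates, the following sketch implements the basic form of a partitioned adaptive estimator under simplifying assumptions: a bank of elemental Kalman filters, one per candidate value of an unknown scalar parameter, with the weighting coefficients updated from the innovation likelihoods and the overall estimate formed as the weighted combination of the elemental estimates. The scalar models, noise levels, and candidate parameter values are invented for illustration, and the model-switching behavior that distinguishes the modified semi-Markov PAE is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Candidate models: scalar x[k+1] = a*x[k] + w,  z[k] = x[k] + v.
# The true value of 'a' is unknown; each elemental filter assumes one candidate.
candidate_a = np.array([0.5, 0.8, 0.95])
Q, R = 0.05, 0.2                   # assumed process / measurement noise variances
true_a = 0.8

x_hat = np.zeros(len(candidate_a))                       # per-model state estimates
P = np.ones(len(candidate_a))                            # per-model error variances
w = np.full(len(candidate_a), 1.0 / len(candidate_a))    # weighting coefficients

def pae_step(z):
    """One cycle of a basic partitioned adaptive estimator (static hypotheses)."""
    global x_hat, P, w
    likelihood = np.empty_like(w)
    for i, a in enumerate(candidate_a):
        # Elemental Kalman filter matched to candidate model i.
        x_pred = a * x_hat[i]
        P_pred = a * P[i] * a + Q
        innov = z - x_pred
        S = P_pred + R                       # innovation variance (H = 1)
        K = P_pred / S
        x_hat[i] = x_pred + K * innov
        P[i] = (1.0 - K) * P_pred
        # Gaussian likelihood of the innovation under model i.
        likelihood[i] = np.exp(-0.5 * innov**2 / S) / np.sqrt(2.0 * np.pi * S)
    # Update the weighting coefficients (posterior model probabilities).
    w = w * likelihood
    w /= w.sum()
    # Overall estimate: weighted combination of the elemental estimates.
    return np.dot(w, x_hat)

# Simulate the true system and run the estimator.
x = 1.0
for k in range(100):
    x = true_a * x + rng.normal(scale=np.sqrt(Q))
    z = x + rng.normal(scale=np.sqrt(R))
    x_combined = pae_step(z)

print("final weighting coefficients:", np.round(w, 3))
print(f"final estimate {x_combined: .3f} vs true state {x: .3f}")
```

In this simple setting the weighting coefficient of the matched model tends toward one while the others decay, which is the kind of coefficient behavior the evaluation correlates with estimator performance.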

Numerous figures and tables which amplify the discussions, along with some suggestions for further research, are also included.
