A dynamic programming approach to single attribute process control

Date
1974
Publisher
Virginia Polytechnic Institute and State University
Abstract

This thesis focuses on the economic design of process control procedures for attributes sampling. The process is modeled as a continuous-time, discrete-space stochastic process that possesses the Markov property; hence, a Markov chain is used to describe its behavior.
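
As a point of reference, a minimal sketch of how such a two-state Markov chain might be set up is given below (in Python rather than the thesis's Fortran IV). The "in control"/"out of control" labels, the exponential shift rate, and the sampling-interval length are illustrative assumptions, not values taken from the thesis.

```python
from math import exp

# Hedged sketch, not the thesis code: a two-state process that is either
# "in control" (state 0) or "out of control" (state 1). Shifts are assumed
# to occur at an exponential rate LAM; the value below is illustrative.
LAM = 0.05  # assumed shift rate per unit time

def transition_matrix(h):
    """One-step transition probabilities over a sampling interval of length h.
    Once the process shifts out of control it remains there until adjusted."""
    p_stay = exp(-LAM * h)           # P(still in control after h)
    return [[p_stay, 1.0 - p_stay],  # from in control
            [0.0, 1.0]]              # out of control is absorbing

if __name__ == "__main__":
    P = transition_matrix(4.0)       # hypothetical interval length
    print("P(shift during interval) =", round(P[0][1], 4))
```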

Two models are developed. The first model holds the decision variables at fixed values and is optimized using a pattern search procedure. The second model is a dynamic programming formulation whose optimal decision policies vary with the expected state of the process.
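
To make the fixed-parameter optimization concrete, the sketch below implements a simple exploratory pattern search over two decision variables. The cost surface, starting point, and step schedule are placeholders and do not reproduce the thesis's actual procedure or parameters.

```python
# Hedged sketch of an exploratory pattern search, standing in for the pattern
# search procedure mentioned above. The demo cost function is a stand-in for
# the model's expected-cost surface, not the thesis's formulation.

def pattern_search(cost, x0, step=1.0, shrink=0.5, tol=1e-3):
    """Minimize cost(x) by trial moves along each coordinate, halving the
    step size whenever no move improves the current point."""
    x, fx = list(x0), cost(x0)
    while step > tol:
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = x[:]
                trial[i] += delta
                f_trial = cost(trial)
                if f_trial < fx:
                    x, fx, improved = trial, f_trial, True
                    break
        if not improved:
            step *= shrink
    return x, fx

if __name__ == "__main__":
    # Illustrative quadratic surrogate; the true surface would come from the
    # Markov chain cost model.
    demo_cost = lambda v: (v[0] - 3.0) ** 2 + (v[1] - 1.5) ** 2 + 2.0
    print(pattern_search(demo_cost, [10.0, 10.0]))
```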

Several cost components are considered in the mathematical development of each model: the cost of sampling, the cost of process adjustment, and the cost of producing a defective unit. The fixed-parameter model also includes the cost of a false indication of the process state.
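
As an illustration only, the sketch below shows how cost components of this kind could enter a finite-horizon dynamic programming recursion over the expected state of the process. The belief grid, cost values, shift rate, and defective fraction are hypothetical and are not drawn from the thesis.

```python
from math import exp

# Hedged sketch: a backward recursion over a discretized belief that the
# process is out of control. All numbers below are illustrative placeholders.
C_SAMPLE, C_ADJUST, C_DEFECT = 1.0, 25.0, 5.0   # assumed unit costs
LAM, H, N_UNITS = 0.05, 4.0, 20                 # shift rate, interval, units per interval
P_SHIFT = 1.0 - exp(-LAM * H)                   # chance of a shift in one interval
FRAC_DEF = 0.10                                 # assumed defective fraction when out of control
GRID = [i / 50 for i in range(51)]              # belief grid for P(out of control)

def stage_cost(p, adjust):
    """Expected cost of one interval given belief p and the chosen action."""
    defect_cost = C_DEFECT * FRAC_DEF * N_UNITS * (0.0 if adjust else p)
    return C_SAMPLE + (C_ADJUST if adjust else 0.0) + defect_cost

def solve(horizon):
    """value[k] is the cost-to-go from belief GRID[k]; policy is the first-stage rule."""
    value = [0.0] * len(GRID)
    policy = ["continue"] * len(GRID)
    for _ in range(horizon):
        new_value, policy = [], []
        for p in GRID:
            p_run = p + (1.0 - p) * P_SHIFT        # belief drifts up if we only continue
            p_adj = P_SHIFT                        # adjustment resets the process
            cont = stage_cost(p, False) + value[min(round(p_run * 50), 50)]
            adj = stage_cost(p, True) + value[min(round(p_adj * 50), 50)]
            new_value.append(min(cont, adj))
            policy.append("adjust" if adj < cont else "continue")
        value = new_value
    return value, policy

if __name__ == "__main__":
    _, policy = solve(horizon=10)
    threshold = next((p for p, a in zip(GRID, policy) if a == "adjust"), None)
    print("adjust when P(out of control) >=", threshold)
```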

Computer programs written in Fortran IV are developed and used to find the optimal system designs. Example problems are presented to illustrate both models. In every example, the dynamic programming model is shown to offer considerable economic improvement over the steady-state (fixed-parameter) model.
