##### Abstract

Quadratic minimization problems constitute an important class of optimization problems. Research has shown that a convex quadratic programming problem can be reduced to a least distance programming problem by a unique, non-singular "linear least distance transformation." A computational algorithm for determining this transformation is presented. The algorithm uses a variation of the Cholesky factorization technique for positive definite matrices, chosen because Cholesky factorization is known to be numerically stable.
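The reduction can be sketched via the standard Cholesky substitution (an illustrative sketch of the usual construction, not necessarily the exact transformation developed in this work):

```latex
% Convex QP with positive definite Q:
%   min_x  (1/2) x'Qx + c'x   subject to  Ax >= b.
% Cholesky factor Q = R'R; substitute y = Rx + R^{-T}c, i.e. x = R^{-1}(y - R^{-T}c):
\tfrac12 x^{\top} Q x + c^{\top} x
  \;=\; \tfrac12 \lVert y \rVert^{2} \;-\; \tfrac12 c^{\top} Q^{-1} c .
% The second term is constant, and the constraints become
A R^{-1}\bigl(y - R^{-\top} c\bigr) \;\ge\; b ,
```

so the quadratic program is equivalent to minimizing \(\lVert y \rVert\), the distance from the origin to a polyhedron, i.e. a least distance programming problem.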
In solving the least distance programming problem, certain special cases involving orthogonality properties of the constraint matrix lend themselves to a simple solution procedure. Theorems establishing the optimality of such solutions are presented. In particular, solutions to least distance programming problems with a row-orthogonal constraint matrix are easily obtained, and a generalization of this result to a special case of quadratic programming is also presented. A QR-transformation is found to be of no use in reducing a general least distance programming problem to a row-orthogonal one, and the column orthogonality property of the constraint matrix does not contribute to the solution of the problem. These special cases, though useful in their own right, have limited application to the solution of a general least distance programming problem.
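To illustrate why row orthogonality simplifies matters, consider the equality-constrained case (a standard closed-form projection, shown here for intuition rather than as the theorem proved in this work): when the rows of \(E\) are orthonormal, \(EE^{\top} = I\), the nearest point to \(g\) satisfying \(Ex = f\) is

```latex
\min_x \; \lVert x - g \rVert \quad \text{s.t. } Ex = f, \;\; EE^{\top} = I
\qquad\Longrightarrow\qquad
x^{\star} \;=\; g + E^{\top}\bigl(f - Eg\bigr),
```

since the general projection \(x^{\star} = g - E^{\top}(EE^{\top})^{-1}(Eg - f)\) loses its matrix inverse when the Gram matrix \(EE^{\top}\) is the identity, so no linear system need be solved.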
Using generalized inverses, a new "generalized dual" of the least distance programming problem is derived. The dual problem is solvable by existing quadratic programming techniques; in addition, it is shown that it can be solved by Ben-Israel's modified Newton method. A special case of the generalized dual problem is considered: a least distance programming problem whose constraint matrix has a generalized inverse that is a partial isometry is easily solved. Since it is difficult to identify this latter class of matrices, the results have rather limited applications.
Finally, a convergent iterative algorithm for solving the least distance programming problem is presented. This algorithm has wide applicability. It is based on a cutting plane technique and requires the computation of an interior feasible direction at each iteration. The method of Ho and Kashyap may be used to establish this direction; the "e-method" proposed here provides an alternative for computing the feasible directions and generally requires much less computation. A few numerical examples are presented to illustrate these algorithms.