Dual Variable Metric Algorithms for Constrained Optimization
We present a class of algorithms for solving constrained optimization problems. In these algorithms, nonnegatively constrained quadratic programming subproblems are solved iteratively to obtain estimates of the Lagrange multipliers, and from these estimates a sequence of points converging to the solution is generated. To achieve a superlinear rate of convergence, the matrix appearing in the subproblem is required to be an approximate inverse of the Hessian of the Lagrangian. Well-known variable metric updates, such as the BFGS update, are employed to generate this matrix, and the resulting algorithm converges locally at a superlinear rate. When the penalty Lagrangian developed by Hestenes, Powell and Rockafellar is incorporated, the algorithm turns out to be closely related to the recently developed method of multipliers. Unlike the method of multipliers, our algorithm attains a superlinear rate of convergence without requiring the penalty parameter to go to infinity, and therefore avoids the numerical instability that this causes.
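The iteration described above can be sketched concretely. The following is a minimal, hedged illustration (the test problem, variable names, and the single-constraint closed-form QP solve are all assumptions for illustration, not details from the paper): at each step a nonnegatively constrained QP in the multiplier lambda is solved, the primal point is updated from the QP's stationarity condition, and the matrix B, an estimate of the inverse Hessian of the Lagrangian, is revised by the BFGS update.

```python
import numpy as np

# Illustrative problem (not from the paper):
#   min (x1-1)^2 + (x2-2)^2   s.t.   g(x) = 1 - x1 - x2 >= 0
# Known solution: x* = (0, 1), multiplier lam* = 2.
c = np.array([1.0, 2.0])
grad_f = lambda x: 2.0 * (x - c)
g = lambda x: 1.0 - x[0] - x[1]
A = np.array([[-1.0, -1.0]])   # constraint Jacobian (constant here)

x = np.zeros(2)
B = np.eye(2)                  # initial inverse-Hessian estimate
lam = 0.0

for _ in range(25):
    gf = grad_f(x)
    # Dual QP subproblem: min_{lam >= 0}  0.5*M*lam^2 - b*lam.
    # With a single constraint it has the closed form lam = max(0, b/M).
    M = float(A @ B @ A.T)
    b = float(A @ B @ gf - g(x))
    lam = max(0.0, b / M)
    # Primal step from the subproblem's stationarity condition.
    a = A.T.flatten()
    d = B @ (a * lam - gf)
    if np.linalg.norm(d) < 1e-10:
        break
    x_new = x + d
    # BFGS update of B using the change in the Lagrangian gradient
    # grad_x L = grad_f - A^T lam (the A^T lam terms cancel here
    # because the constraint is linear).
    s = d
    y = (grad_f(x_new) - a * lam) - (gf - a * lam)
    ys = float(y @ s)
    if ys > 1e-12:               # skip update if curvature condition fails
        rho = 1.0 / ys
        V = np.eye(2) - rho * np.outer(s, y)
        B = V @ B @ V.T + rho * np.outer(s, s)
    x = x_new

print(x, lam)
```

On this quadratic test problem the BFGS estimate picks up the correct curvature along the constraint surface within a few iterations, so the iterates reach the solution quickly without any penalty parameter.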