Superlinearly Convergent Variable Metric Algorithms for General Nonlinear Programming Problems
In this paper, variable metric algorithms are extended to solve general nonlinear programming problems. At each iteration, the algorithm solves a linearly constrained quadratic program that contains an estimate of the Hessian of the Lagrangian. We propose variable metric updates for these Hessian estimates and justify the proposal by showing that, when a well-known update such as the Davidon-Fletcher-Powell update is employed in this way, the algorithm converges locally at a superlinear rate. The algorithm is, in a sense, a natural extension of the variable metric method to constrained optimization, and this extension offers not only a class of effective algorithms for nonlinear programming but also a unified treatment of constrained and unconstrained optimization within the variable metric approach.
computer science; technical report