Author: Han, Shih-Ping
Date accessioned: 2007-04-23
Date available: 2007-04-23
Date issued: 1975-03
Identifier: http://techreports.library.cornell.edu:8081/Dienst/UI/1.0/Display/cul.cs/TR75-233
Identifier: https://hdl.handle.net/1813/7308
Abstract: In this paper variable metric algorithms are extended to solve general nonlinear programming problems. The algorithm iteratively solves a linearly constrained quadratic program that contains an estimate of the Hessian of the Lagrangian. We suggest variable metric updates for these Hessian estimates and justify the suggestion by showing that, when a well-known update such as the Davidon-Fletcher-Powell update is employed in this way, the algorithm converges locally at a superlinear rate. Our algorithm is, in a sense, a natural extension of variable metric algorithms to constrained optimization, and this extension offers not only a class of effective algorithms for nonlinear programming but also a unified treatment of constrained and unconstrained optimization within the variable metric approach.
Format: 1392743 bytes; 537059 bytes; application/pdf; application/postscript
Language: en-US
Subject: computer science; technical report
Title: Superlinearly Convergent Variable Metric Algorithms for General Nonlinear Programming Problems
Type: technical report
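
As a rough illustration of the iteration described in the abstract, the sketch below applies the idea to the equality-constrained case: each step solves the linearly constrained quadratic subproblem through its KKT system and then revises the Hessian estimate of the Lagrangian with a DFP-type update. The function names, the equality-only setting, and the positive-curvature safeguard are assumptions made for this example, not details taken from the report.

```python
import numpy as np

def sqp_dfp(f, grad_f, c, jac_c, x0, tol=1e-8, max_iter=50):
    """Minimal sketch (assumed setting): min f(x) subject to c(x) = 0.

    Each iteration solves the quadratic subproblem
        min_d  g'd + 0.5 d'Bd   s.t.  A d + c = 0,
    where B is a variable metric estimate of the Hessian of the
    Lagrangian, and then updates B with a DFP-type formula.
    """
    x = np.asarray(x0, dtype=float)
    n, m = len(x), len(c(x))
    B = np.eye(n)          # initial Hessian estimate
    lam = np.zeros(m)      # multiplier estimate

    for _ in range(max_iter):
        g, cv, A = grad_f(x), c(x), jac_c(x)

        # Solve the QP subproblem via its KKT system:
        # [B  A'] [d  ]   [-g]
        # [A  0 ] [lam] = [-c]
        K = np.block([[B, A.T], [A, np.zeros((m, m))]])
        sol = np.linalg.solve(K, -np.concatenate([g, cv]))
        d, lam = sol[:n], sol[n:]

        if np.linalg.norm(d) < tol and np.linalg.norm(cv) < tol:
            break

        x_new = x + d

        # Difference of Lagrangian gradients, evaluated at the new multipliers.
        y = (grad_f(x_new) + jac_c(x_new).T @ lam) - (g + A.T @ lam)
        s = d

        # DFP update of B; skipped unless the curvature condition s'y > 0 holds
        # (a standard safeguard, assumed here rather than taken from the report).
        sy = s @ y
        if sy > 1e-12:
            rho = 1.0 / sy
            V = np.eye(n) - rho * np.outer(y, s)
            B = V @ B @ V.T + rho * np.outer(y, y)

        x = x_new

    return x, lam
```

For instance, calling sqp_dfp with f(x) = x[0]**2 + x[1]**2 and the single constraint c(x) = [x[0] + x[1] - 1] (with the corresponding gradient and Jacobian) converges quickly to x = (0.5, 0.5), consistent with the local superlinear behavior the abstract attributes to DFP-type updates.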