On The Local Convergence of The Byrd-Schnabel Algorithm For Constrained Optimization

Author
Coleman, Thomas F.; Liao, Ai-Ping
Abstract
Most reduced Hessian methods for equality constrained problems use a basis for the null space of the matrix of constraint gradients and possess superlinear convergence rates under the assumption that the basis varies continuously. However, computing a continuously varying null space basis is not straightforward. Byrd and Schnabel [2] propose an alternative implementation that is independent of the choice of null space basis, thus obviating the need for a continuously varying one. In this note we prove that the primary sequence of iterates generated by one version of their algorithm exhibits a local 2-step Q-superlinear convergence rate. We also establish that a sequence of "midpoints", in a closely related algorithm, is (1-step) Q-superlinearly convergent.
Key words: constrained optimization, null space, superlinear convergence, reduced Hessian.
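To make the abstract's setting concrete, here is a minimal NumPy sketch (not from the paper) of the null-space machinery it refers to: an orthonormal basis Z for the null space of a constraint Jacobian A, computed via SVD, and the reduced Hessian Zᵀ H Z formed with respect to that particular choice of Z. The matrices A and H below are illustrative assumptions; note that a different valid basis Z would in general yield a different reduced Hessian, which is exactly the basis-dependence the Byrd-Schnabel implementation avoids.

```python
import numpy as np

def null_space_basis(A, tol=1e-12):
    """Return an orthonormal basis Z for the null space of A.

    Rows of A are the constraint gradients; the columns of Z
    span {z : A z = 0}.
    """
    _, s, vt = np.linalg.svd(A)
    rank = int(np.sum(s > tol))
    return vt[rank:].T

# Illustrative example: one linear constraint gradient in R^3
A = np.array([[1.0, 1.0, 1.0]])   # m = 1 constraint, n = 3 variables
Z = null_space_basis(A)           # shape (3, 2): two null-space directions

H = np.diag([2.0, 3.0, 4.0])      # a sample positive definite Hessian
H_red = Z.T @ H @ Z               # reduced Hessian w.r.t. this basis Z
```

A continuously varying choice of Z along the iterate sequence is what classical superlinear-convergence proofs assume; the SVD above gives no such continuity guarantee, which motivates the basis-independent formulation analyzed in the report.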
Date Issued
1992-02
Publisher
Cornell University
Subject
computer science; technical report
Previously Published As
http://techreports.library.cornell.edu:8081/Dienst/UI/1.0/Display/cul.cs/TR92-1268
Type
technical report