On The Local Convergence of The Byrd-Schnabel Algorithm For Constrained Optimization
Most reduced Hessian methods for equality constrained problems use a basis for the null space of the matrix of constraint gradients and possess superlinear convergence rates under the assumption that this basis varies continuously. However, computing a continuously varying null space basis is not straightforward. Byrd and Schnabel [2] propose an alternative implementation that is independent of the choice of null space basis, thus obviating the need for a continuously varying basis. In this note we prove that the primary sequence of iterates generated by one version of their algorithm exhibits a local 2-step Q-superlinear convergence rate. We also establish that a sequence of "midpoints" in a closely related algorithm is (1-step) Q-superlinearly convergent.

Key words: constrained optimization, null space, superlinear convergence, reduced Hessian.
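The setting referred to above can be sketched in standard notation (the symbols $f$, $c$, $Z$, and $\ell$ are conventional choices assumed here, not taken from the abstract):

```latex
% Equality constrained problem (a sketch in standard notation):
\[
  \min_{x \in \mathbb{R}^n} f(x)
  \quad \text{subject to} \quad c(x) = 0,
  \qquad f:\mathbb{R}^n \to \mathbb{R}, \;
         c:\mathbb{R}^n \to \mathbb{R}^m .
\]
% A null space basis $Z(x) \in \mathbb{R}^{n \times (n-m)}$ satisfies
\[
  \nabla c(x)^{T} Z(x) = 0 ,
\]
% and the reduced Hessian of the Lagrangian
% $\ell(x,\lambda) = f(x) + \lambda^{T} c(x)$ is
\[
  Z(x)^{T} \, \nabla_{xx}^{2}\ell(x,\lambda) \, Z(x) .
\]
```

Reduced Hessian methods maintain an approximation to this $(n-m)\times(n-m)$ matrix; since $Z(x)$ is not unique, the approximation depends on how the basis is chosen at each iterate, which is why continuity of the basis enters the classical convergence analyses.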