Cornell University Library

eCommons


On The Local Convergence of The Byrd-Schnabel Algorithm For Constrained Optimization

File(s)
92-1268.ps (188.35 KB)
92-1268.pdf (650.09 KB)
Permanent Link(s)
https://hdl.handle.net/1813/7108
Collections
Computer Science Technical Reports
Author
Coleman, Thomas F.
Liao, Ai-Ping
Abstract

Most reduced Hessian methods for equality constrained problems use a basis for the null space of the matrix of constraint gradients and possess superlinearly convergent rates under the assumption of continuity of the basis. However, computing a continuously varying null space basis is not straightforward. Byrd and Schnabel [2] propose an alternative implementation that is independent of the choice of null space basis, thus obviating the need for a continuously varying null space basis. In this note we prove that the primary sequence of iterates generated by one version of their algorithm exhibits a local 2-step Q-superlinear convergence rate. We also establish that a sequence of "midpoints", in a closely related algorithm, is (1-step) Q-superlinearly convergent. Key words: constrained optimization, null space, superlinear convergence, reduced Hessian.
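To make the abstract's central object concrete, the sketch below computes an orthonormal basis for the null space of a constraint gradient matrix via the SVD. This is purely illustrative: the matrix `A` is made up, and this is one standard way to compute such a basis, not the construction used in the Byrd-Schnabel report. Note that a basis obtained this way is not unique and can change discontinuously as the constraints vary, which is exactly the difficulty the abstract says their basis-independent formulation sidesteps.

```python
import numpy as np

# Hypothetical constraint gradient matrix: 2 equality constraints in 3 variables.
# (Illustrative data only; not taken from the report.)
A = np.array([[1.0, 2.0, 3.0],
              [0.0, 1.0, 4.0]])

# The right singular vectors whose singular values are (numerically) zero
# span null(A); their transpose gives an orthonormal null-space basis Z.
_, s, Vt = np.linalg.svd(A)
rank = int(np.sum(s > 1e-12))
Z = Vt[rank:].T  # columns of Z form an orthonormal basis for null(A)

# Sanity checks: A annihilates Z, and Z has orthonormal columns.
print(np.allclose(A @ Z, 0))                      # True
print(np.allclose(Z.T @ Z, np.eye(Z.shape[1])))   # True
```

In a reduced Hessian method, such a `Z` is used to project the problem onto the null space of the constraints; since `Z` is only determined up to an orthogonal transformation, an algorithm whose iterates do not depend on which basis is chosen (as in [2]) avoids the continuity requirement entirely.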

Date Issued
1992-02
Publisher
Cornell University
Keywords
computer science • technical report
Previously Published as
http://techreports.library.cornell.edu:8081/Dienst/UI/1.0/Display/cul.cs/TR92-1268
Type
technical report
