Cornell University Library

eCommons


Sequential Estimation Techniques for Quasi-Newton Algorithms

File(s)
75-227.pdf (4.9 MB)
75-227.ps (2.15 MB)
Permanent Link(s)
https://hdl.handle.net/1813/6862
Collections
Computer Science Technical Reports
Author
Thomas, Stephen Walter
Abstract

Let $F$ be a mapping from real $n$-dimensional Euclidean space into itself. In terms of solving for a zero of $F$, the quasi-Newton algorithms are based on the generalized Newton iteration $x_{k+1} = x_{k} - B_{k}^{-1}F(x_{k})$, where $\{B_{k}\}$ is a sequence of nonsingular matrices. In this thesis some geometrical sequential estimation techniques are applied to the calculation of $B_{k}$ as an estimate for the Jacobian matrix $F'(x_{k})$. The resulting estimators manifest themselves as either rank-one or rank-two, symmetric updates for $B_{k}$, together with an update for a matrix which is descriptive of the error $E_{k} = B_{k} - F'(x_{k})$. These updates are in fact shown to produce optimal estimates in the sense that they minimize the size of a certain bound-set for the error matrix $E_{k}$. It is shown that the use of the new updates in conjunction with the generalized Newton iteration produces locally and Q-superlinearly convergent algorithms. Moreover, under the requirement that the steps taken by the algorithm satisfy a uniform linear independence condition, it is shown that the R-order of convergence associated with the symmetric update is at least as large as the positive root of $t^{n+1} - t^{n} - 1 = 0$. A similar but somewhat lower convergence-rate bound is proved for the rank-one update. The symmetric update is implemented in an algorithm for unconstrained optimization which employs Powell's dog-leg step direction/length strategy as a globalization technique. Analysis is presented which, under mild conditions on the function to be minimized, shows that the algorithm is globally and Q-superlinearly convergent. The results of some numerical experiments are presented which show that the algorithm's performance compares favorably with Powell's very successful MINFA.
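The generalized Newton iteration described above can be sketched in a few lines. The thesis's own sequential-estimation updates are not reproduced here; as a stand-in this sketch uses Broyden's classical rank-one secant update for $B_k$, which has the same overall shape (iterate, step with $B_k s_k = -F(x_k)$, then apply a rank-one correction to $B_k$). The test function and starting point are illustrative choices, not from the thesis.

```python
import numpy as np

def quasi_newton(F, x0, B0, tol=1e-10, max_iter=50):
    """Generalized Newton iteration x_{k+1} = x_k - B_k^{-1} F(x_k).

    B_k is maintained by Broyden's rank-one update (a stand-in for
    the thesis's estimators): B_{k+1} = B_k + (y - B_k s) s^T / (s^T s),
    which enforces the secant condition B_{k+1} s = y.
    """
    x = np.asarray(x0, dtype=float)
    B = np.asarray(B0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        s = np.linalg.solve(B, -Fx)          # solve B_k s = -F(x_k)
        x_new = x + s
        y = F(x_new) - Fx                    # change in F along the step
        B += np.outer(y - B @ s, s) / (s @ s)  # rank-one secant update
        x = x_new
    return x

# Illustrative system F(x) = (x0^2 + x1^2 - 2, x0 - x1), root at (1, 1);
# B0 is the true Jacobian at the starting point.
F = lambda x: np.array([x[0]**2 + x[1]**2 - 2.0, x[0] - x[1]])
x0 = np.array([2.0, 0.5])
B0 = np.array([[2 * x0[0], 2 * x0[1]], [1.0, -1.0]])
root = quasi_newton(F, x0, B0)
```

Replacing the Broyden correction with the thesis's symmetric rank-two update (plus the companion update for the error-bound matrix) changes only the single `B += ...` line; the surrounding iteration is unchanged.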

Date Issued
1975-01
Publisher
Cornell University
Keywords
computer science
technical report
Previously Published as
http://techreports.library.cornell.edu:8081/Dienst/UI/1.0/Display/cul.cs/TR75-227
Type
technical report
