Sequential Estimation Techniques for Quasi-Newton Algorithms
Let $F$ be a mapping from real $n$-dimensional Euclidean space into itself. In terms of solving for a zero of $F$, the quasi-Newton algorithms are based on the generalized Newton iteration $x_{k+1} = x_{k} - B_{k}^{-1}F(x_{k})$, where $\{B_{k}\}$ is a sequence of nonsingular matrices. In this thesis some geometrical sequential estimation techniques are applied to the calculation of $B_{k}$ as an estimate of the Jacobian matrix $F'(x_{k})$. The resulting estimators take the form of either rank-one or symmetric rank-two updates for $B_{k}$, together with an update for a matrix that describes the error $E_{k} = B_{k} - F'(x_{k})$. These updates are in fact shown to produce optimal estimates in the sense that they minimize the size of a certain set bound for the error matrix $E_{k}$.

It is shown that the use of the new updates in conjunction with the generalized Newton iteration produces locally and Q-superlinearly convergent algorithms. Moreover, under the requirement that the steps taken by the algorithm satisfy a uniform linear independence condition, it is shown that the R-order of convergence associated with the symmetric update is at least as large as the positive root of $t^{n+1} - t^{n} - 1 = 0$. A similar but somewhat lower convergence-rate bound is proved for the rank-one update.

The symmetric update is implemented in an algorithm for unconstrained optimization which employs Powell's dog-leg step direction/length strategy as a globalization technique. Analysis is presented which, under mild conditions on the function to be minimized, shows that the algorithm is globally and Q-superlinearly convergent. The results of some numerical experiments are presented which show that the algorithm's performance compares favorably with Powell's very successful MINFA.
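By way of illustration, the generalized Newton iteration $x_{k+1} = x_{k} - B_{k}^{-1}F(x_{k})$ can be sketched as follows. Broyden's classical rank-one update is used here as a stand-in for the estimators derived in the thesis (which are defined in the body, not in this abstract); the test function and starting data are likewise illustrative.

```python
import numpy as np

def quasi_newton(F, x0, B0, tol=1e-10, max_iter=50):
    """Generalized Newton iteration x_{k+1} = x_k - B_k^{-1} F(x_k),
    with B_k maintained by a rank-one (Broyden-style) update."""
    x, B = np.asarray(x0, dtype=float), np.asarray(B0, dtype=float)
    for _ in range(max_iter):
        Fx = F(x)
        if np.linalg.norm(Fx) < tol:
            break
        s = np.linalg.solve(B, -Fx)      # quasi-Newton step
        x_new = x + s
        y = F(x_new) - Fx                # change in F along the step
        # Rank-one update enforcing the secant equation B_new s = y
        B = B + np.outer(y - B @ s, s) / (s @ s)
        x = x_new
    return x

# Illustrative system: intersect the unit circle with the line x0 = x1
F = lambda x: np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])
B0 = np.array([[2.0, 1.0], [1.0, -1.0]])   # Jacobian F'(x0) at the start
root = quasi_newton(F, x0=[1.0, 0.5], B0=B0)
```

Starting from the true Jacobian at $x_{0}$, the iterates converge to $(1/\sqrt{2},\, 1/\sqrt{2})$ without ever re-evaluating $F'$, which is the practical appeal of updating $B_{k}$ rather than recomputing it.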
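The R-order bound quoted above, the positive root of $t^{n+1} - t^{n} - 1 = 0$, is easy to evaluate numerically; the following sketch (illustrative, not from the thesis) finds it by bisection.

```python
def r_order_bound(n, tol=1e-12):
    """Positive root of t**(n+1) - t**n - 1 = 0, which bounds the
    R-order of the symmetric update in dimension n."""
    p = lambda t: t**(n + 1) - t**n - 1.0
    lo, hi = 1.0, 2.0                # p(1) = -1 < 0 and p(2) = 2**n - 1 > 0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if p(mid) < 0 else (lo, mid)
    return 0.5 * (lo + hi)
```

For $n = 1$ the root is the golden ratio $\approx 1.618$ (the classical secant-method order), and the bound decreases toward $1$ as the dimension $n$ grows, e.g. $\approx 1.466$ for $n = 2$.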
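Powell's dog-leg strategy, mentioned above as the globalization technique, chooses a step along a piecewise-linear path from the steepest-descent (Cauchy) point toward the full quasi-Newton point, truncated at a trust-region radius. A minimal sketch of the standard dog-leg step, assuming a gradient `g`, a positive definite Hessian estimate `B`, and a radius `delta`:

```python
import numpy as np

def dogleg_step(g, B, delta):
    """Standard dog-leg step for min model m(p) = g.p + 0.5 p.B.p
    subject to ||p|| <= delta (a sketch of the strategy, not the
    thesis's exact implementation)."""
    p_newton = np.linalg.solve(B, -g)
    if np.linalg.norm(p_newton) <= delta:
        return p_newton                        # full step fits in the region
    p_cauchy = -(g @ g) / (g @ B @ g) * g      # model minimizer along -g
    if np.linalg.norm(p_cauchy) >= delta:
        return -delta * g / np.linalg.norm(g)  # truncated gradient step
    # Intersect the segment p_cauchy + tau*(p_newton - p_cauchy)
    # with the sphere ||p|| = delta (positive root of a quadratic in tau)
    d = p_newton - p_cauchy
    a, b = d @ d, 2.0 * (p_cauchy @ d)
    c = p_cauchy @ p_cauchy - delta**2
    tau = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
    return p_cauchy + tau * d
```

The radius `delta` is then adapted from step to step according to how well the quadratic model predicted the actual decrease, which is what supplies the global convergence behavior claimed for the algorithm.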