Optimal Conditioning in the Convex Class of Rank Two Updates
Schnabel, Robert B.
Davidon's new quasi-Newton optimization algorithm selects the new inverse Hessian approximation H̄ at each step to be the "optimally conditioned" member of a certain one-parameter class of rank two updates to the previous inverse Hessian approximation H. In this paper, we show that virtually the same goals of conditioning can be achieved while restricting H̄ to the convex class of updates. We therefore suggest that Davidon's algorithms using optimal conditioning restrict the choice of H̄ to members of the convex class.
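The one-parameter class of rank two updates referred to above is commonly known as the Broyden family, whose convex class interpolates between the DFP update (parameter 0) and the BFGS update (parameter 1); every member satisfies the secant equation H̄y = s. A minimal sketch of this family, assuming the standard notation s for the step and y for the gradient difference (the function name and variable names here are illustrative, not from the report):

```python
import numpy as np

def convex_class_update(H, s, y, phi):
    """One-parameter (Broyden) family of rank two updates to the
    inverse Hessian approximation H.  phi = 0 gives the DFP update,
    phi = 1 gives BFGS; phi in [0, 1] is the convex class.  Every
    member satisfies the secant equation H_new @ y == s."""
    sy = s @ y            # s^T y, assumed positive (curvature condition)
    Hy = H @ y
    yHy = y @ Hy          # y^T H y, positive when H is positive definite
    # DFP member of the family: a rank two correction to H
    H_dfp = H + np.outer(s, s) / sy - np.outer(Hy, Hy) / yHy
    # Rank one direction along which the family varies; note v^T y = 0,
    # so the secant equation holds for every phi
    v = np.sqrt(yHy) * (s / sy - Hy / yHy)
    return H_dfp + phi * np.outer(v, v)
```

All members of the convex class preserve symmetry and, under the curvature condition sᵀy > 0, positive definiteness of the approximation.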
computer science; technical report