An Unconstrained Optimization Algorithm Which Uses Function and Gradient Values
Author
Dennis, John E., Jr.
Mei, Howell Hung-Wei
Abstract
A new method for unconstrained optimization is presented. It consists of a modification of Powell's 1970 dogleg strategy, with the approximate Hessian given by Davidon's 1975 updating scheme, which uses the projections of $\Delta x$ and $\Delta g$ in updating $H$ and $G$ and optimizes the condition number of $H^{-1}H_{+}$. The new algorithm performs well without Powell's special iterations and singularity safeguards, and only symmetric, positive definite updates to the Hessian are used.
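The abstract's reference to Powell's 1970 dogleg strategy can be illustrated with a minimal sketch of the standard dogleg trust-region step. This is not the authors' exact algorithm (which modifies the dogleg path and pairs it with Davidon's update); the function name and structure below are illustrative, assuming a symmetric positive definite Hessian approximation as the abstract requires.

```python
import numpy as np

def dogleg_step(g, B, delta):
    """Standard dogleg step for the trust-region subproblem
    min m(p) = g^T p + 0.5 p^T B p  subject to  ||p|| <= delta,
    assuming B is symmetric positive definite."""
    # Full quasi-Newton step; take it whenever it lies inside the region.
    p_newton = -np.linalg.solve(B, g)
    if np.linalg.norm(p_newton) <= delta:
        return p_newton

    # Cauchy point: minimizer of the model along the steepest descent direction.
    p_cauchy = -(g @ g) / (g @ B @ g) * g
    norm_c = np.linalg.norm(p_cauchy)
    if norm_c >= delta:
        # Even the Cauchy point is outside: truncate the gradient step.
        return -(delta / np.linalg.norm(g)) * g

    # Otherwise follow the dogleg path from the Cauchy point toward the
    # Newton point until it crosses the trust-region boundary.
    d = p_newton - p_cauchy
    a = d @ d
    b = 2 * (p_cauchy @ d)
    c = norm_c**2 - delta**2
    tau = (-b + np.sqrt(b**2 - 4 * a * c)) / (2 * a)
    return p_cauchy + tau * d
```

Powell's 1970 scheme wraps a step of this form in a trust-region loop that expands or shrinks `delta` based on how well the quadratic model predicts the actual function decrease.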
Date Issued
1975-06
Publisher
Cornell University
Previously Published as
http://techreports.library.cornell.edu:8081/Dienst/UI/1.0/Display/cul.cs/TR75-246
Type
technical report