An Unconstrained Optimization Algorithm Which Uses Function and Gradient Values

Author
Dennis, John E., Jr.; Mei, Howell Hung-Wei
Abstract
A new method for unconstrained optimization is presented. It combines a modification of Powell's 1970 dogleg strategy with the approximate Hessian given by Davidon's 1975 updating scheme, which uses the projections of $\Delta x$ and $\Delta g$ in updating $H$ and $G$ and optimizes the condition number of $H^{-1}H_{+}$. The new algorithm performs well without Powell's special iterations and singularity safeguards. Only symmetric and positive definite updates to the Hessian are used.
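The dogleg strategy referenced in the abstract chooses a trust-region step by interpolating between the steepest-descent (Cauchy) point and the full Newton step. The following is a minimal illustrative sketch of a standard dogleg step, not the report's modified variant; the function name and interface are assumptions for illustration, and $B$ stands in for the symmetric positive definite Hessian approximation $H$.

```python
import numpy as np

def dogleg_step(g, B, delta):
    """Sketch of a dogleg step for the model m(p) = g.p + 0.5 p.B.p
    subject to ||p|| <= delta. Assumes B is symmetric positive definite.
    Illustrative only; not the modified strategy of the report."""
    # Full Newton step: unconstrained minimizer of the quadratic model.
    p_newton = -np.linalg.solve(B, g)
    if np.linalg.norm(p_newton) <= delta:
        return p_newton
    # Cauchy point: minimizer of the model along the steepest-descent direction.
    p_cauchy = -((g @ g) / (g @ B @ g)) * g
    norm_pc = np.linalg.norm(p_cauchy)
    if norm_pc >= delta:
        # Even the Cauchy point lies outside: scale it back to the boundary.
        return -(delta / np.linalg.norm(g)) * g
    # Walk along the dogleg path from the Cauchy point toward the Newton
    # point, stopping at the trust-region boundary: solve
    # ||p_cauchy + t (p_newton - p_cauchy)|| = delta for t in [0, 1].
    d = p_newton - p_cauchy
    a = d @ d
    b = 2.0 * (p_cauchy @ d)
    c = norm_pc**2 - delta**2
    t = (-b + np.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
    return p_cauchy + t * d
```

With a large trust radius the sketch returns the Newton step unchanged; with a small radius it returns a scaled steepest-descent step, which is the behavior the dogleg path is designed to interpolate between.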
Date Issued
1975-06
Publisher
Cornell University
Subject
computer science; technical report
Previously Published As
http://techreports.library.cornell.edu:8081/Dienst/UI/1.0/Display/cul.cs/TR75-246
Type
technical report