SMOOTH QUASI-NEWTON METHODS FOR NONSMOOTH OPTIMIZATION

Author
Guo, Jiayi
Abstract
The success of Newton’s method for smooth optimization, when Hessians are available, motivated the idea of quasi-Newton methods, which approximate Hessians in response to changes in gradients and achieve superlinear convergence on smooth functions. Sporadic informal observations over several decades (and, more formally, recent work of Lewis and Overton) suggest that such methods also work surprisingly well on nonsmooth functions. This thesis explores this phenomenon from several perspectives. First, Powell’s fundamental 1976 convergence proof for the popular Broyden-Fletcher-Goldfarb-Shanno (BFGS) quasi-Newton method on smooth convex functions in fact extends to some nonsmooth settings. Secondly, removing the influence of linesearch techniques by introducing linesearch-free quasi-Newton approaches (including a version of Shor’s R-algorithm) shows, in particular, how repeated quasi-Newton updating at a single point can serve as a separation technique for convex sets. Lastly, an experimental comparison, in the nonsmooth setting, of the two most popular smooth quasi-Newton updates, BFGS and Symmetric Rank-One (SR1), emphasizes the power of the BFGS update.
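As background for the two updates the abstract compares, the following is a minimal NumPy sketch of the standard textbook BFGS (inverse-Hessian form) and SR1 updates, with the usual skipping safeguards. This is an illustration of the generic updates, not the thesis's own code; the short demo deliberately uses a fixed step length rather than a linesearch, echoing the linesearch-free theme, and the tolerances and test function are assumptions chosen for the example.

    import numpy as np

    def bfgs_update(H, s, y):
        """One BFGS update of an inverse-Hessian approximation H.

        s = x_new - x_old (step), y = g_new - g_old (gradient change).
        Skips the update when the curvature condition y.s > 0 fails,
        a common safeguard, especially relevant on nonsmooth functions.
        """
        ys = y @ s
        if ys <= 1e-12 * np.linalg.norm(y) * np.linalg.norm(s):
            return H  # curvature condition violated: keep H unchanged
        rho = 1.0 / ys
        V = np.eye(len(s)) - rho * np.outer(s, y)
        # H+ = (I - rho s y^T) H (I - rho y s^T) + rho s s^T
        return V @ H @ V.T + rho * np.outer(s, s)

    def sr1_update(B, s, y):
        """One Symmetric Rank-One update of a Hessian approximation B."""
        r = y - B @ s
        rs = r @ s
        if abs(rs) <= 1e-8 * np.linalg.norm(r) * np.linalg.norm(s):
            return B  # denominator too small: skip to avoid blow-up
        return B + np.outer(r, r) / rs

    # Tiny demo on the nonsmooth convex function f(x) = |x[0]| + x[1]**2,
    # with a fixed step length in place of a linesearch (illustration only).
    f_grad = lambda x: np.array([np.sign(x[0]) if x[0] != 0 else 1.0,
                                 2.0 * x[1]])
    x, H = np.array([3.0, 2.0]), np.eye(2)
    g = f_grad(x)
    for _ in range(20):
        x_new = x - 0.1 * H @ g  # quasi-Newton step on the inverse model
        g_new = f_grad(x_new)
        H = bfgs_update(H, x_new - x, g_new - g)
        x, g = x_new, g_new

One design point worth noting: the BFGS safeguard preserves positive definiteness of H, so the step direction stays a descent direction, whereas SR1 makes no such guarantee and its approximation can become indefinite, one reason the two updates behave so differently in the nonsmooth experiments.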
Date Issued
2018-05-30
Subject
Operations research; Optimization; BFGS; Convex; Nonsmooth; Quasi-Newton
Committee Chair
Lewis, Adrian S.
Committee Member
Frazier, Peter; Bindel, David S.
Degree Discipline
Operations Research
Degree Name
Ph.D., Operations Research
Degree Level
Doctor of Philosophy
Rights
Attribution 4.0 International
Rights URI
https://creativecommons.org/licenses/by/4.0/
Type
dissertation or thesis