Some Extensions on the Reach of First-Order Optimization Theory

Author
Grimmer, Benjamin David
Abstract
This thesis concerns the foundations of first-order optimization theory. In recent years, these methods have found tremendous success in a wide range of domains due to their incredibly scalable nature. We aim to extend the reach of first-order optimization guarantees, lessening the gap between practice and theory, as well as enabling the design of new algorithms. We show that many of the typical assumptions in the theory literature are not fundamentally needed. For example, analysis in nonsmooth optimization typically relies on Lipschitz continuity, convexity, and carefully chosen stepsize parameters. Chapters 2-4 show that classic methods can be applied and analyzed without relying on these structures. Then Chapters 5-8 consider reformulations of optimization problems that further extend the reach of first-order methods. Constrained optimization can be reformulated to avoid orthogonal projections, generic optimization problems can be reformulated to possess Hölder growth/error bounds, and minimax optimization problems can be smoothed and convexified/concavified. Together these results challenge the historical limits on which problems are considered tractable to analyze.
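The classical setting the abstract refers to can be illustrated with the textbook subgradient method: minimizing a nonsmooth convex function using a carefully chosen diminishing stepsize such as 1/sqrt(k+1). This is a minimal sketch of that standard baseline (not the thesis's own algorithms); the function, stepsize rule, and iteration count here are illustrative choices.

```python
import numpy as np

def subgradient_method(f, subgrad, x0, steps=2000):
    """Classic subgradient method x_{k+1} = x_k - alpha_k * g_k with the
    textbook diminishing stepsize alpha_k = 1/sqrt(k+1).
    Returns the best iterate seen, since plain iterates need not be monotone."""
    x = np.asarray(x0, dtype=float)
    best_x, best_f = x.copy(), f(x)
    for k in range(steps):
        g = subgrad(x)                 # any subgradient at the current point
        x = x - g / np.sqrt(k + 1)     # the carefully chosen stepsize schedule
        fx = f(x)
        if fx < best_f:
            best_x, best_f = x.copy(), fx
    return best_x, best_f

# Nonsmooth convex example: f(x) = ||x||_1, minimized at the origin,
# with sign(x) as a valid subgradient.
f = lambda x: np.abs(x).sum()
subgrad = lambda x: np.sign(x)
x_star, f_star = subgradient_method(f, subgrad, x0=[3.0, -2.0])
```

Convergence guarantees for this scheme traditionally assume Lipschitz continuity and convexity of f and this kind of prescribed stepsize schedule; relaxing such assumptions is exactly the direction Chapters 2-4 pursue.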
Description
305 pages
Date Issued
2021-08
Subject
Convergence Guarantees; Convexity; First-Order Optimization; Nonconvexity
Committee Chair
Renegar, James
Committee Member
Davis, Damek Shea; Lewis, Adrian S.
Degree Discipline
Operations Research and Information Engineering
Degree Name
Ph.D., Operations Research and Information Engineering
Degree Level
Doctor of Philosophy
Rights
Attribution-ShareAlike 4.0 International
Type
dissertation or thesis