Some Extensions on the Reach of First-Order Optimization Theory
Grimmer, Benjamin David
This thesis concerns the foundations of first-order optimization theory. In recent years, these methods have found tremendous success in a wide range of domains due to their highly scalable nature. We aim to extend the reach of first-order optimization guarantees, narrowing the gap between practice and theory and enabling the design of new algorithms. We show that many of the typical assumptions in the theoretical literature are not fundamentally needed. For example, analysis in nonsmooth optimization typically relies on Lipschitz continuity, convexity, and carefully chosen stepsize parameters. Chapters 2-4 show that classic methods can be applied and analyzed without relying on these structures. Chapters 5-8 then consider reformulations of optimization problems that further extend the reach of first-order methods: constrained optimization can be reformulated to avoid orthogonal projections, generic optimization problems can be reformulated to possess Hölder growth/error bounds, and minimax optimization problems can be smoothed and convexified/concavified. Together these results challenge the limits on which problems have historically been considered tractable to analyze.
Convergence Guarantees; Convexity; First-Order Optimization; Nonconvexity
Davis, Damek Shea; Lewis, Adrian S.
Operations Research and Information Engineering
Ph.D., Operations Research and Information Engineering
Doctor of Philosophy
Attribution-ShareAlike 4.0 International
dissertation or thesis
Except where otherwise noted, this item's license is described as Attribution-ShareAlike 4.0 International