Cornell University Library

eCommons


Saddle Avoidance, Asymptotic Normality, and Exponential Acceleration in Nonsmooth Optimization

File(s)
Jiang_cornellgrad_0058F_14308.pdf (5.59 MB)
Permanent Link(s)
https://doi.org/10.7298/4pq5-qe42
https://hdl.handle.net/1813/115936
Collections
Cornell Theses and Dissertations
Author
Jiang, Liwei
Abstract

Optimization-based algorithms are the foundation of empirically successful methods in modern fields such as artificial intelligence and data science. Although classical optimization theory provides guarantees for functions that are smooth or convex, a significant portion of modern problems possess neither property. Despite worst-case examples for which efficient algorithms are unavailable, typical nonsmooth functions exhibit a "partly smooth" structure, meaning that they are well-behaved relative to a smooth "active manifold." This thesis develops and analyzes first-order algorithms that exploit this nonsmooth structure. We first develop two regularity conditions describing how subgradients interact with active manifolds and show that they hold for a broad and generic class of functions. With these cornerstones, we demonstrate that, when randomly perturbed or equipped with stochastic noise, subgradient methods converge only to minimizers of generic, Clarke regular semialgebraic problems. When convergence to a particular minimizer is known, we show that stochastic (projected) subgradient methods exhibit asymptotic normality, making them asymptotically optimal algorithms in the locally minimax sense of Hájek and Le Cam. These findings culminate in a new first-order algorithm—NTDescent—which exhibits nearly linear local convergence on typical nonsmooth functions with quadratic growth. The convergence rate of NTDescent depends only on intrinsic quantities of the function, not on the problem's underlying dimension.
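To make the abstract's central object concrete, the following is a minimal sketch of a randomly perturbed subgradient method applied to the nonsmooth function f(x) = |x|. This is an illustrative toy, not the thesis's NTDescent algorithm or its analysis; the function names, step-size rule, and noise scale here are all assumptions chosen for the example.

```python
import random

def subgradient_abs(x):
    # A subgradient of f(x) = |x|: sign(x), with 0 chosen at the kink x = 0.
    if x > 0:
        return 1.0
    if x < 0:
        return -1.0
    return 0.0

def perturbed_subgradient_method(x0, steps=2000, noise_scale=1e-3, seed=0):
    """Subgradient method on f(x) = |x| with small random perturbations.

    Uses the classical diminishing step size 1/k; the Gaussian noise stands
    in for the random perturbation discussed in the abstract.
    """
    rng = random.Random(seed)
    x = x0
    for k in range(1, steps + 1):
        g = subgradient_abs(x)          # subgradient at the current iterate
        step = 1.0 / k                  # diminishing step size
        noise = noise_scale * rng.gauss(0.0, 1.0)
        x = x - step * (g + noise)      # perturbed subgradient step
    return x

x_final = perturbed_subgradient_method(3.0)
print(abs(x_final))  # close to the minimizer x = 0
```

Because f is nonsmooth at its minimizer, the iterates oscillate around 0 with amplitude on the order of the step size; the diminishing steps damp this oscillation, which is the classical sublinear behavior that partly smooth structure (and algorithms such as NTDescent) aim to improve upon.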

Description
310 pages
Date Issued
2024-05
Keywords
asymptotic normality
first-order method
linear convergence
nonsmooth optimization
parameter-free
saddle point avoidance
Committee Chair
Davis, Damek
Committee Member
Chen, Yudong
Lewis, Adrian
Scheinberg, Katya
Degree Discipline
Operations Research and Information Engineering
Degree Name
Ph.D., Operations Research and Information Engineering
Degree Level
Doctor of Philosophy
Rights
Attribution 4.0 International
Rights URI
https://creativecommons.org/licenses/by/4.0/
Type
dissertation or thesis
Link(s) to Catalog Record
https://newcatalog.library.cornell.edu/catalog/16575442
