Blackbox Optimization, Nonsmooth Structure, and Survey Descent
Abstract
First-order, blackbox optimization methods aim to minimize objective functions using only function and gradient evaluations. On smooth, strongly convex objectives, classical results ensure the linear convergence of these methods relative to the number of function and gradient evaluations. An analogous nonsmooth theory is challenging due to a lack of reliable local models. As a consequence, convergence guarantees for nonsmooth optimization procedures have generally remained sublinear. In this dissertation, I propose a "survey descent" method, inspired by a new max-of-smooth model, in which an iteratively updated "survey" of points moves closer and closer to the objective minimizer while guided by (but never touching) an "active manifold" that lies at the boundary between the different smooth subdomains of the nonsmooth objective. I prove that, when the nonsmooth objective possesses a "finite-max" structure, survey descent exhibits local linear convergence, and I demonstrate empirically that survey descent holds promise even beyond the finite-max case. I also propose a blackbox initialization procedure for survey descent that finds an initial survey containing exactly one point from each smooth subdomain of a finite-max objective. Of separate interest, I show how the minimization of the numerical radii of square matrices frequently generates solutions that land on the aforementioned active manifold, an interesting example of the "identification" property of proximal operators, a common subroutine within nonsmooth optimization methods.
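To make the "finite-max" structure concrete, the following is a minimal illustrative sketch, not taken from the dissertation: it builds a max-of-smooth objective from a few strongly convex quadratics and runs the classical subgradient method, whose guarantees are only sublinear, which is the baseline the abstract contrasts against. All names and constants here are illustrative assumptions; the survey descent update itself is more involved and is not reproduced.

import numpy as np

# Illustrative finite-max ("max-of-smooth") objective:
# f(x) = max_i f_i(x), each f_i(x) = 0.5 x'A_i x + b_i'x smooth and strongly convex.
rng = np.random.default_rng(0)
d, k = 2, 3  # dimension and number of smooth pieces (arbitrary choices)
pieces = []
for _ in range(k):
    M = rng.standard_normal((d, d))
    A = M @ M.T + np.eye(d)  # symmetric positive definite => strongly convex piece
    b = rng.standard_normal(d)
    pieces.append((A, b))

def f_and_subgrad(x):
    """Blackbox oracle: return f(x) and one subgradient of f at x.

    The gradient of any piece achieving the max is a valid subgradient
    of the max function at x.
    """
    vals = [0.5 * x @ A @ x + b @ x for A, b in pieces]
    i = int(np.argmax(vals))  # index of an active smooth piece at x
    A, b = pieces[i]
    return vals[i], A @ x + b

# Classical subgradient method with diminishing steps: sublinear convergence.
x = np.array([2.0, -1.0])
for t in range(1, 201):
    val, g = f_and_subgrad(x)
    x = x - (1.0 / t) * g

By contrast, the dissertation's survey descent maintains one point per smooth subdomain of such an objective and updates the whole survey jointly, which is what yields the local linear convergence claimed above under the finite-max assumption.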
Committee Member
Chen, Yudong