Grey-Box Bayesian Optimization: Improving Performance by Looking Inside the Black-Box
Abstract
Non-convex, time-consuming objectives are often optimized with "black-box" methods, which assume very little about the objective. While broadly applicable, such methods typically require more evaluations than approaches that exploit additional problem structure. In particular, we can often acquire information about the objective function in ways other than direct evaluation, and at lower cost than evaluating the objective itself. This allows us to develop novel Bayesian optimization algorithms that outperform methods relying only on objective function evaluations. In this thesis, we consider three problems: optimizing sums and integrals of expensive-to-evaluate integrands; optimizing hyperparameters of iteratively trained supervised machine learning algorithms; and optimizing non-convex functions with a new, efficient multistart stochastic gradient descent algorithm.
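To make the setting concrete, below is a minimal sketch of a standard Bayesian optimization loop (Gaussian process surrogate plus expected improvement) applied to a toy objective that happens to decompose into a sum of terms. This is not the thesis's grey-box algorithms; every name, kernel choice, and parameter here (objective, rbf_kernel, the search interval [0, 2], the budget of 15 evaluations) is an illustrative assumption.

```python
# Minimal Bayesian optimization sketch (illustrative, not the thesis method).
# A black-box method sees only the total objective value; a grey-box method
# could additionally query the individual terms inside `objective`.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def objective(x):
    # Toy objective written as a sum of cheap "integrands": f(x) = sum_i f_i(x).
    integrands = [np.sin(3 * x), 0.5 * np.cos(5 * x), -0.1 * x ** 2]
    return sum(integrands)

def rbf_kernel(A, B, length=0.3, var=1.0):
    # Squared-exponential kernel between two 1-D point sets.
    d = A[:, None] - B[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(X, y, Xq, noise=1e-6):
    # Gaussian process posterior mean and standard deviation at query points Xq.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xq)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.clip(rbf_kernel(Xq, Xq).diagonal() - np.sum(v ** 2, axis=0), 1e-12, None)
    return mu, np.sqrt(var)

def expected_improvement(mu, sigma, best):
    # Expected improvement over the incumbent best value (maximization).
    z = (mu - best) / sigma
    return (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)

# Optimization loop on [0, 2]: fit the GP, maximize EI on a grid, evaluate there.
X = rng.uniform(0.0, 2.0, size=3)
y = np.array([objective(x) for x in X])
grid = np.linspace(0.0, 2.0, 400)
for _ in range(15):
    mu, sigma = gp_posterior(X, y, grid)
    ei = expected_improvement(mu, sigma, y.max())
    x_next = grid[np.argmax(ei)]
    X = np.append(X, x_next)
    y = np.append(y, objective(x_next))

print("best x:", X[np.argmax(y)], "best value:", y.max())
```

In this sketch the decomposition into integrands is ignored by the optimizer; the thesis's point is that exploiting such structure (and other cheaper sources of information) can reduce the number of expensive evaluations required.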
Committee Member: Bindel, David