Scalable Gaussian Processes and Bayesian Optimization with Application to Hyperparameter Tuning
dc.contributor.author | Zhu, Xinran | |
dc.contributor.chair | Bindel, David | en_US |
dc.contributor.committeeMember | Townsend, Alex | en_US |
dc.contributor.committeeMember | Weinberger, Kilian | en_US |
dc.date.accessioned | 2024-11-05T19:47:44Z | |
dc.date.available | 2024-11-05T19:47:44Z | |
dc.date.issued | 2024-05 | |
dc.description | 210 pages | en_US |
dc.description.abstract | This dissertation delves into the advanced realms of Gaussian Processes (GPs) and Bayesian Optimization (BO), presenting novel methodologies that enhance their performance and applicability. GPs, as a principled probabilistic approach, are powerful tools for modeling complex and noisy functions due to their non-parametric nature and capacity for uncertainty quantification. However, exact GPs become intractable for large datasets, since the computational cost scales cubically with the size of the dataset. In particular, this dissertation focuses on improving variational GPs, which handle large-scale data by sparsifying the model via inducing points and approximating the posterior. Despite these advances, variational GPs may still require many inducing points, and hence significant computational cost, to achieve good accuracy; this dissertation aims to bridge that gap. This dissertation also studies efficient computational methods for Bayesian transformed GPs (BTG), which are particularly useful when the Gaussian assumption is not satisfied and data are limited. Furthermore, the dissertation explores BO as a method for optimizing complex and expensive objective functions, with an emphasis on its application to hyperparameter tuning. By leveraging the probabilistic modeling strengths of GPs, BO can efficiently traverse the hyperparameter space, reducing the need for extensive model evaluations. Through the introduction of novel algorithms and methodologies, this research not only enhances the performance of BTG and variational GPs but also broadens the scope of BO in hyperparameter tuning. | en_US |
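The abstract contrasts exact GP inference, whose cost grows cubically with the number of data points, against sparse approximations built on inducing points. As a minimal illustration of that trade-off (not the dissertation's own methodology, which uses variational inference), the NumPy sketch below compares an exact GP posterior mean against a simple subset-of-regressors approximation with a handful of inducing points; the kernel, data, and inducing locations are all illustrative choices.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=1.0):
    # Squared-exponential kernel matrix between the rows of A and B.
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
    return np.exp(-0.5 * d2 / lengthscale**2)

def exact_gp_predict(X, y, Xs, noise=1e-2):
    # Exact GP posterior mean: Cholesky factorization of the full n x n
    # kernel matrix costs O(n^3) time and O(n^2) memory.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return rbf_kernel(Xs, X) @ alpha

def inducing_gp_predict(X, y, Xs, Z, noise=1e-2):
    # Subset-of-regressors approximation with m inducing points Z:
    # only m x m systems are solved, for O(n m^2) total cost.
    Kmm = rbf_kernel(Z, Z) + 1e-8 * np.eye(len(Z))
    Kmn = rbf_kernel(Z, X)
    A = Kmn @ Kmn.T + noise * Kmm
    c = np.linalg.solve(A, Kmn @ y)
    return rbf_kernel(Xs, Z) @ c

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(200)
Xs = np.linspace(-3, 3, 50)[:, None]
Z = np.linspace(-3, 3, 15)[:, None]   # 15 inducing points vs. 200 data points

mu_exact = exact_gp_predict(X, y, Xs)
mu_sparse = inducing_gp_predict(X, y, Xs, Z)
print(np.max(np.abs(mu_exact - mu_sparse)))  # discrepancy is typically small
```

The gap the abstract highlights is visible here: accuracy hinges on how many inducing points are used and where they sit, which is exactly what variational GP methods optimize rather than fixing by hand.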
dc.identifier.doi | https://doi.org/10.7298/ae3g-pw33 | |
dc.identifier.other | Zhu_cornellgrad_0058F_14151 | |
dc.identifier.other | http://dissertations.umi.com/cornellgrad:14151 | |
dc.identifier.uri | https://hdl.handle.net/1813/116053 | |
dc.language.iso | en | |
dc.rights | Attribution 4.0 International | * |
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | * |
dc.title | Scalable Gaussian Processes and Bayesian Optimization with Application to Hyperparameter Tuning | en_US |
dc.type | dissertation or thesis | en_US |
dcterms.license | https://hdl.handle.net/1813/59810.2 | |
thesis.degree.discipline | Applied Mathematics | |
thesis.degree.grantor | Cornell University | |
thesis.degree.level | Doctor of Philosophy | |
thesis.degree.name | Ph. D., Applied Mathematics |
Files
Original bundle
- Name: Zhu_cornellgrad_0058F_14151.pdf
- Size: 3.34 MB
- Format: Adobe Portable Document Format