Scalable Gaussian Processes and Bayesian Optimization with Application to Hyperparameter Tuning

dc.contributor.author: Zhu, Xinran
dc.contributor.chair: Bindel, David
dc.contributor.committeeMember: Townsend, Alex
dc.contributor.committeeMember: Weinberger, Kilian
dc.date.accessioned: 2024-11-05T19:47:44Z
dc.date.available: 2024-11-05T19:47:44Z
dc.date.issued: 2024-05
dc.description: 210 pages
dc.description.abstract: This dissertation delves into the advanced realms of Gaussian Processes (GPs) and Bayesian Optimization (BO), presenting novel methodologies that enhance their performance and applicability. GPs, as a principled probabilistic approach, are powerful in modeling complex and noisy functions due to their non-parametric nature and capability for uncertainty quantification. However, exact GPs become intractable for large datasets since the computational cost scales cubically with the size of the dataset. In particular, this dissertation focuses on improving variational GPs, which are able to handle large-scale data by sparsifying the model via inducing points and approximating the posterior. Despite advances, variational GPs may still require many inducing points (and significant computational cost) to achieve good accuracy, a gap this dissertation aims to bridge.

This dissertation also studies efficient computational methods for Bayesian transformed GPs (BTG), which is particularly useful when the Gaussian assumption is not satisfied and data is limited. Furthermore, the dissertation explores BO as a method for optimizing complex and expensive objective functions, with an emphasis on its application in hyperparameter tuning. By leveraging the probabilistic modeling strengths of GPs, BO can efficiently traverse the hyperparameter space, thus reducing the need for extensive model evaluations. Through the introduction of novel algorithms and methodologies, this research not only enhances the performance of BTG and variational GPs but also broadens the scope of BO in hyperparameter tuning.
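As a rough illustration of the workflow the abstract describes (a GP surrogate guiding the search over a space of candidate inputs), the Python sketch below fits a GP to a few evaluations of a toy objective and selects each next point by expected improvement. The objective function, Matern kernel, candidate grid, and budget of 10 iterations are assumptions made for this example; none are details taken from the dissertation.

# Minimal sketch of Bayesian optimization with a GP surrogate and
# expected improvement (EI). The objective, kernel, and search grid
# are illustrative assumptions, not taken from the dissertation.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

def objective(x):
    # Hypothetical expensive black-box function (stand-in for, e.g.,
    # a model's validation loss as a function of one hyperparameter).
    return np.sin(3 * x) + 0.1 * x ** 2

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(3, 1))            # a few initial evaluations
y = objective(X).ravel()
grid = np.linspace(-2, 2, 500).reshape(-1, 1)  # candidate points

for _ in range(10):                            # BO loop: fit, acquire, evaluate
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)
    mu, sigma = gp.predict(grid, return_std=True)
    best = y.min()                             # minimizing the objective
    imp = best - mu
    z = imp / np.maximum(sigma, 1e-9)
    ei = imp * norm.cdf(z) + sigma * norm.pdf(z)  # expected improvement
    x_next = grid[np.argmax(ei)].reshape(1, 1)
    X = np.vstack([X, x_next])
    y = np.append(y, objective(x_next).ravel())

print("best x:", X[y.argmin()].item(), "best value:", y.min())

In the large-scale setting the dissertation targets, the exact GP fit above (cubic in the number of observations) would be replaced by a sparse variational approximation whose cost is governed by a much smaller set of inducing points.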
dc.identifier.doi: https://doi.org/10.7298/ae3g-pw33
dc.identifier.other: Zhu_cornellgrad_0058F_14151
dc.identifier.other: http://dissertations.umi.com/cornellgrad:14151
dc.identifier.uri: https://hdl.handle.net/1813/116053
dc.language.iso: en
dc.rights: Attribution 4.0 International
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.title: Scalable Gaussian Processes and Bayesian Optimization with Application to Hyperparameter Tuning
dc.type: dissertation or thesis
dcterms.license: https://hdl.handle.net/1813/59810.2
thesis.degree.discipline: Applied Mathematics
thesis.degree.grantor: Cornell University
thesis.degree.level: Doctor of Philosophy
thesis.degree.name: Ph. D., Applied Mathematics

Files

Original bundle
Name: Zhu_cornellgrad_0058F_14151.pdf
Size: 3.34 MB
Format: Adobe Portable Document Format