
dc.contributor.author     Shaby, Benjamin                      en_US
dc.date.accessioned       2009-10-13T20:28:48Z
dc.date.available         2009-10-13T20:28:48Z
dc.date.issued            2009-10-13T20:28:48Z
dc.identifier.other       bibid: 6711595
dc.identifier.uri         https://hdl.handle.net/1813/13950
dc.description.abstract (en_US):
Each of the three chapters included here attempts to meet a different computing challenge that presents itself in the context of Bayesian statistics. The first deals with the difficulty of evaluating the computationally expensive likelihood functions that arise from models that include Gaussian random field components. This challenge can be mitigated by introducing sparsity into the covariance matrix in a principled way. Chapter 1 analyzes the properties of estimates, including Bayesian-like estimates, based on this "tapering" strategy. The second challenge is how to design good MCMC samplers. Chapter 2 explores an adaptive Metropolis-Hastings sampler, motivating why such adaptation is needed and demonstrating its efficacy. Chapter 2 concludes by comparing the efficiency of adaptively tuned Metropolis samplers to that of three very popular MCMC algorithms, demonstrating that besides having the attractive properties of simplicity and almost unlimited flexibility, adaptively tuned Metropolis samplers are also extremely efficient. Finally, the third challenge is endowing hierarchical models with the ability to represent conditional mean structures that are complicated, unknown functions of very many covariates. Chapter 3 describes how this is accomplished through the HEBBRU framework, whereby data mining methods are embedded into hierarchical models and fit using an approximate Gibbs sampler.
dc.language.iso           en_US                                en_US
dc.subject                Bayesian Computations                en_US
dc.title                  Tools For Hard Bayesian Computations en_US
dc.type                   dissertation or thesis               en_US
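
The covariance "tapering" strategy mentioned in the abstract's summary of Chapter 1 (introducing sparsity into the covariance matrix) can be illustrated with a minimal sketch. This is not code from the dissertation: the exponential covariance, the spherical taper, and all ranges and sizes below are illustrative assumptions.

import numpy as np
from scipy import sparse
from scipy.spatial.distance import cdist

def exponential_cov(locs, range_):
    # Dense exponential covariance for locations in the plane.
    return np.exp(-cdist(locs, locs) / range_)

def spherical_taper(locs, taper_range):
    # Spherical correlation function: compactly supported, so it is
    # exactly zero for pairs farther apart than taper_range.
    d = cdist(locs, locs)
    x = np.minimum(d / taper_range, 1.0)
    return (1 - x) ** 2 * (1 + x / 2)

# Tapered covariance: the elementwise (Schur) product of the model
# covariance and the taper. Both factors are positive semidefinite,
# so the product is too, and it inherits the taper's sparsity.
rng = np.random.default_rng(0)
locs = rng.uniform(0, 10, size=(500, 2))
K_tapered = sparse.csr_matrix(
    exponential_cov(locs, range_=2.0) * spherical_taper(locs, taper_range=1.0)
)
print(f"fraction of nonzero entries: {K_tapered.nnz / 500 ** 2:.3f}")

Forming the dense matrix first, as above, is only for clarity; the computational savings the abstract alludes to come from computing only the pairs within the taper range and then factoring the resulting sparse matrix.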
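The adaptively tuned Metropolis samplers mentioned for Chapter 2 can likewise be sketched generically: a random-walk Metropolis whose proposal scale is nudged toward a target acceptance rate in batches. The target rate (0.44 for a one-dimensional target), batch size, and shrinking step-size schedule below are conventional choices assumed for illustration, not necessarily the tuning rule studied in the dissertation.

import numpy as np

def adaptive_metropolis(log_post, x0, n_iter=5000, target_acc=0.44,
                        batch=50, seed=0):
    # Random-walk Metropolis with batch-wise adaptation of the
    # proposal scale (an illustrative adaptive scheme).
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    log_sigma = 0.0              # log of the proposal standard deviation
    lp = log_post(x)
    chain, accepted = [], 0
    for i in range(1, n_iter + 1):
        prop = x + np.exp(log_sigma) * rng.standard_normal(x.shape)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept step
            x, lp = prop, lp_prop
            accepted += 1
        chain.append(x.copy())
        if i % batch == 0:
            # Raise the scale if accepting too often, lower it if too
            # rarely; the shrinking step size makes adaptation diminish.
            rate = accepted / batch
            log_sigma += (rate - target_acc) / np.sqrt(i // batch)
            accepted = 0
    return np.array(chain)

# Example usage: sample a standard normal target.
draws = adaptive_metropolis(lambda x: -0.5 * float(np.sum(x ** 2)),
                            x0=np.zeros(1))
print(draws.mean(), draws.std())   # roughly 0 and 1

The step size shrinking with the batch index is what makes the adaptation diminish over time, a standard condition for adaptive MCMC to preserve the target distribution.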

