Advances in Adaptive and Deep Bayesian State-Space Models

Author
Wu, Haoxuan
Abstract
This work explores modeling and forecasting of time series data using a Bayesian state-space approach. Three chapters are included, each describing a methodology built on the Bayesian state-space framework to address a different challenge. The first methodology adds shrinkage and threshold estimation to a dynamic linear model to identify changepoints. The second methodology focuses on decoupling shrinkage and modeling to provide posterior summaries. The third methodology incorporates a recurrent neural network into a state-space framework to produce robust and accurate forecasts for large datasets. While these methodologies are broadly applicable, a unifying theme is the use of global-local shrinkage priors.

In Chapter 2, we introduce global-local shrinkage priors into a Bayesian dynamic linear model to adaptively estimate both changepoints and local outliers in a novel model we call Adaptive Bayesian Changepoints with Outliers (ABCO). We utilize a state-space approach to identify a dynamic signal in the presence of outliers and measurement error with stochastic volatility. This setup provides a flexible framework for detecting unspecified changepoints in complex series, such as those with large interruptions in local trends, with robustness to outliers and heteroskedastic noise.

In Chapter 3, we introduce a new approach for decoupling trends (drift) and changepoints (shifts) in time series. Our locally adaptive, model-based approach to robust decoupling combines Bayesian trend filtering with machine-learning-based regularization. The proposed decoupling approach incorporates the strengths of both methods, namely the flexibility of Bayesian DLMs and the hard-thresholding property of penalized likelihood estimators, to provide changepoint analysis in complex, modern settings. The proposed framework is robust to outliers and can identify a variety of changes, including changes in mean and slope.

In Chapter 4, we introduce a new version of deep state-space models (DSSMs) that combines a recurrent neural network with a state-space framework to forecast time series data. The model estimates the observed series as functions of latent variables that evolve non-linearly through time. Our work focuses on producing interpretable latent parameters via two key modifications. First, we simplify the predictive decoder by restricting the response variables to be a linear transformation of the latent variables plus noise. Second, we place shrinkage priors on the latent variables to reduce redundancy and improve robustness. These changes make the latent variables much easier to understand and allow them to be interpreted as random effects in a linear mixed model.
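As a rough illustration of the unifying theme, and not the exact parameterization used in Chapters 2 or 3, a dynamic linear model with a horseshoe-type global-local shrinkage prior on the state innovations can be sketched as

\[
y_t = \beta_t + \varepsilon_t, \qquad \varepsilon_t \sim N(0, \sigma_t^2),
\]
\[
\Delta^D \beta_t = \omega_t, \qquad \omega_t \mid \tau, \lambda_t \sim N(0, \tau^2 \lambda_t^2),
\]
\[
\lambda_t \sim C^{+}(0, 1), \qquad \tau \sim C^{+}(0, 1),
\]

where \Delta^D denotes the D-th order difference and C^{+}(0,1) is the standard half-Cauchy distribution. The local scales \lambda_t allow individual innovations \omega_t to be large, which marks changepoints, while the global scale \tau shrinks the remaining innovations toward zero; letting \log \sigma_t^2 follow a stochastic volatility process accommodates heteroskedastic noise, and an additional outlier component, as in ABCO, absorbs local outliers.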
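For Chapter 4, the two modifications can be illustrated with a minimal, hypothetical sketch, not the dissertation's implementation: the class name, dimensions, and the L1 penalty standing in for a shrinkage prior below are illustrative assumptions. A recurrent network drives the latent variables, and the observations are modeled as a linear transformation of those latents plus noise.

import torch
import torch.nn as nn

class LinearDecoderDSSM(nn.Module):
    # Hypothetical sketch: RNN-driven latent state with a purely linear decoder.
    def __init__(self, obs_dim: int, latent_dim: int, hidden_dim: int = 32):
        super().__init__()
        self.rnn = nn.GRU(obs_dim, hidden_dim, batch_first=True)  # non-linear latent dynamics
        self.to_latent = nn.Linear(hidden_dim, latent_dim)        # latent variables z_t
        self.decoder = nn.Linear(latent_dim, obs_dim)             # linear map from latents to observation mean

    def forward(self, y: torch.Tensor):
        # y has shape (batch, time, obs_dim)
        h, _ = self.rnn(y)
        z = self.to_latent(h)
        return self.decoder(z), z

# Toy usage: reconstruct a simulated panel of 3 series over 50 time points.
model = LinearDecoderDSSM(obs_dim=3, latent_dim=2)
y = torch.randn(8, 50, 3)
y_hat, z = model(y)
# An L1 penalty on the latents stands in for the shrinkage prior that removes
# redundant latent dimensions in the full Bayesian treatment.
loss = ((y - y_hat) ** 2).mean() + 0.1 * z.abs().mean()
loss.backward()

Keeping the decoder linear is what makes the latent variables interpretable as random effects in a linear mixed model: each observed series is a fixed linear combination of a small set of shrunken latent processes plus noise.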
Description
155 pages
Date Issued
2022-08
Subject
Deep Learning; Dynamic Linear Model; Structural Change; Time Series
Committee Chair
Matteson, David
Committee Member
Ruppert, David; Joachims, Thorsten
Degree Discipline
Statistics
Degree Name
Ph. D., Statistics
Degree Level
Doctor of Philosophy
Rights
Attribution 4.0 International
Rights URI
Type
dissertation or thesis