Stacking for Non-mixing Bayesian Computations: The Curse and Blessing of Multimodal Posteriors

06/22/2020
by   Yuling Yao, et al.

When working with multimodal Bayesian posterior distributions, Markov chain Monte Carlo (MCMC) algorithms can have difficulty moving between modes, and default variational or mode-based approximate inferences will understate posterior uncertainty. Even when the most important modes can be found, it is difficult to evaluate their relative weights in the posterior. Here we propose an alternative approach, using parallel runs of MCMC, variational, or mode-based inference to hit as many modes or separated regions as possible, and then combining these using importance-sampling-based Bayesian stacking, a scalable method for constructing a weighted average of distributions so as to maximize cross-validated prediction utility. The result from stacking is not necessarily equivalent, even asymptotically, to fully Bayesian inference, but it serves many of the same goals. Under misspecified models, stacking can give better predictive performance than full Bayesian inference, so the multimodality can be considered a blessing rather than a curse. We explore examples in which the stacked inference approximates the true data-generating process from a misspecified model, an example of inconsistent inference, and non-mixing samplers. We elaborate the practical implementation in the context of latent Dirichlet allocation, Gaussian process regression, hierarchical models, variational inference in horseshoe regression, and neural networks.
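The core of the stacking step can be sketched compactly: given pointwise predictive densities from each non-mixing chain (or mode), choose simplex weights that maximize the summed log of the weighted predictive density. The sketch below is illustrative, not the paper's implementation: it uses exact densities on a toy two-mode problem rather than the leave-one-out (e.g. PSIS-LOO) densities the method calls for, and all data values and mode locations are made up. Because the objective is a mixture log likelihood in the weights, simple EM-style fixed-point updates suffice here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data from a two-component mixture; pretend two non-mixing chains
# each captured one mode (locations and proportions are illustrative).
y = np.concatenate([rng.normal(-2.0, 1.0, 70), rng.normal(3.0, 1.0, 30)])

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) evaluated at x."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Pointwise predictive densities p_k(y_i) from each "chain"/mode.
# In practice these would be cross-validated (leave-one-out) densities.
P = np.column_stack([normal_pdf(y, -2.0, 1.0), normal_pdf(y, 3.0, 1.0)])

# Stacking: maximize sum_i log sum_k w_k p_k(y_i) over the simplex.
# EM-style multiplicative updates for the mixture-weight MLE.
w = np.full(P.shape[1], 1.0 / P.shape[1])
for _ in range(500):
    resp = (P * w) / (P * w).sum(axis=1, keepdims=True)  # responsibilities
    w = resp.mean(axis=0)

print(np.round(w, 2))  # weights roughly track the 70/30 mode proportions
```

Because the two modes are well separated, the optimal weights approximately recover the data proportions of the two modes; with overlapping modes or leave-one-out densities the weights would differ from the raw mode frequencies.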


