Deep Markov Chain Monte Carlo

10/13/2019
by Babak Shahbaba, et al.

We propose a new computationally efficient sampling scheme for Bayesian inference involving high-dimensional probability distributions. Our method maps the original parameter space into a low-dimensional latent space, explores the latent space to generate samples, and maps these samples back to the original space for inference. While our method can be used in conjunction with any dimension reduction technique to obtain the latent space, and any standard sampling algorithm to explore the low-dimensional space, here we specifically use a combination of autoencoders (for dimensionality reduction) and Hamiltonian Monte Carlo (HMC, for sampling). To this end, we first run HMC to generate some initial samples from the original parameter space, and then use these samples to train an autoencoder. Next, starting with an initial state, we use the encoding part of the autoencoder to map the initial state to a point in the low-dimensional latent space. Using another HMC, this point is then treated as an initial state in the latent space to generate a new state, which is then mapped to the original space using the decoding part of the autoencoder. The resulting point can be treated as a Metropolis-Hastings (MH) proposal, which is either accepted or rejected. While the induced dynamics in the parameter space are no longer Hamiltonian, they remain time reversible, and the Markov chain can still converge to the canonical distribution using a volume correction term. Dropping the volume correction step results in convergence to an approximate but reasonably accurate distribution. Empirical results on several high-dimensional problems show that our method can substantially reduce the computational cost of Bayesian inference.
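The pipeline described in the abstract (HMC warm-up, autoencoder training, latent-space HMC, decode, MH accept/reject) can be sketched on a toy problem. The sketch below is an illustrative assumption, not the authors' code: it uses a synthetic 50-dimensional Gaussian target concentrated near a 2-D subspace, substitutes a linear autoencoder (PCA) for a deep one, and implements the approximate variant that drops the volume-correction term.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy target: a 50-D Gaussian concentrated near a 2-D subspace,
# so a faithful low-dimensional latent space exists.
D, d = 50, 2
A, _ = np.linalg.qr(rng.normal(size=(D, d)))  # orthonormal subspace basis
P = A @ A.T                                   # projector onto the subspace
prec = P + 100.0 * (np.eye(D) - P)            # precision: tight off-subspace

def log_p(x):
    return -0.5 * x @ prec @ x

def grad_log_p(x):
    return -prec @ x

# One HMC step: leapfrog integration plus an MH correction.
def hmc_step(x, logp, grad, step=0.05, n_leap=10):
    p0 = rng.normal(size=x.shape)
    x1, p1 = x.copy(), p0 + 0.5 * step * grad(x)
    for i in range(n_leap):
        x1 = x1 + step * p1
        if i < n_leap - 1:
            p1 = p1 + step * grad(x1)
    p1 = p1 + 0.5 * step * grad(x1)
    log_a = (logp(x1) - 0.5 * p1 @ p1) - (logp(x) - 0.5 * p0 @ p0)
    return x1 if np.log(rng.uniform()) < log_a else x

# Step 1: initial HMC run in the original parameter space.
x = rng.normal(size=D)
warmup = []
for _ in range(500):
    x = hmc_step(x, log_p, grad_log_p)
    warmup.append(x)
W = np.array(warmup)

# Step 2: fit an "autoencoder" to the warm-up samples.
# PCA (a linear autoencoder) stands in for the paper's deep one.
mu = W.mean(axis=0)
_, _, Vt = np.linalg.svd(W - mu, full_matrices=False)
B = Vt[:d].T                                  # top-d principal directions
encode = lambda x: B.T @ (x - mu)
decode = lambda z: mu + B @ z

# Steps 3-4: HMC in the latent space, decode, then treat the decoded
# point as an MH proposal (no volume correction: the approximate variant).
latent_logp = lambda z: log_p(decode(z))
latent_grad = lambda z: B.T @ grad_log_p(decode(z))  # chain rule via B

samples = []
for _ in range(1000):
    z_new = hmc_step(encode(x), latent_logp, latent_grad, step=0.3)
    x_prop = decode(z_new)
    if np.log(rng.uniform()) < log_p(x_prop) - log_p(x):
        x = x_prop
    samples.append(x)
samples = np.array(samples)
print(samples.shape)  # (1000, 50)
```

In this linear sketch the accepted states are confined to the learned affine subspace, which is exactly the kind of approximation the volume-correction term and a richer decoder are meant to address; the sketch only illustrates the control flow of the method.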


