
Toward Unlimited Self-Learning Monte Carlo with Annealing Process Using VAE's Implicit Isometricity

by Yuma Ichikawa, et al.

Self-learning Monte Carlo (SLMC) methods have recently been proposed to accelerate Markov chain Monte Carlo (MCMC) methods using machine learning models. With generative models that have latent variables, SLMC methods realize efficient Monte Carlo updates with less autocorrelation. However, SLMC methods are difficult to apply directly to multimodal distributions, for which training data are hard to obtain. In this paper, we propose a novel SLMC method, the “annealing VAE-SLMC,” that drastically expands the range of applications. Our VAE-SLMC utilizes a variational autoencoder (VAE) as a generative model to make efficient parallel proposals independent of any previous state, exploiting the theoretically derived implicit isometricity of the VAE. We combine the VAE-SLMC with an adaptive annealing process, making our method applicable to cases where obtaining unbiased training data is difficult in practice due to slow mixing. We also propose a parallel annealing process and an exchange process between chains to make the annealing operation more precise and efficient. Experiments validate that our method can proficiently obtain unbiased samples from multiple multimodal toy distributions and practical multimodal posterior distributions, which is difficult to achieve with existing SLMC methods.
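The key mechanism the abstract describes, proposals drawn independently of the current state from a learned generative model, is an independence Metropolis-Hastings sampler. The sketch below illustrates that idea on a bimodal toy target; it is not the authors' implementation. The names `target_logpdf`, `proposal_sample`, and `proposal_logpdf` are hypothetical, and a broad Gaussian stands in for the trained VAE's generative distribution (a real VAE-SLMC would decode latent draws and would need the proposal's density, or an estimate of it, for the acceptance ratio).

```python
import numpy as np

rng = np.random.default_rng(0)


def target_logpdf(x):
    """Unnormalized log-density of a bimodal toy target:
    an equal mixture of N(-3, 1) and N(3, 1)."""
    return np.logaddexp(-0.5 * (x + 3.0) ** 2, -0.5 * (x - 3.0) ** 2)


def proposal_sample():
    # Stand-in for drawing z ~ N(0, I) and decoding with the VAE:
    # a broad Gaussian that covers both modes of the target.
    return rng.normal(0.0, 4.0)


def proposal_logpdf(x):
    # Unnormalized log-density of the stand-in proposal.
    return -0.5 * (x / 4.0) ** 2


def independence_mh(n_steps, x0=0.0):
    """Independence Metropolis-Hastings: each proposal is drawn
    independently of the current state, so accepted moves can jump
    directly between modes (the source of the low autocorrelation
    the abstract mentions)."""
    x = x0
    samples = []
    for _ in range(n_steps):
        x_new = proposal_sample()
        # Acceptance ratio for an independence sampler:
        # pi(x') q(x) / (pi(x) q(x'))
        log_alpha = (target_logpdf(x_new) - target_logpdf(x)
                     + proposal_logpdf(x) - proposal_logpdf(x_new))
        if np.log(rng.uniform()) < log_alpha:
            x = x_new
        samples.append(x)
    return np.array(samples)


samples = independence_mh(20_000)
# With a proposal covering both modes, the chain should spend
# roughly half its time on each side of zero.
print(np.mean(samples > 0))
```

A random-walk sampler on the same target would rarely cross the low-density region between the modes; the independence proposal avoids that entirely, which is why a well-trained generative proposal is attractive for multimodal targets.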



