MCMC Variational Inference via Uncorrected Hamiltonian Annealing

07/08/2021
by Tomas Geffner, et al.

Given an unnormalized target distribution, we want to obtain approximate samples from it and a tight lower bound on its log normalization constant, log Z. Annealed Importance Sampling (AIS) with Hamiltonian MCMC is a powerful method for this, but its main drawback is that it uses non-differentiable transition kernels, which makes its many parameters hard to tune. We propose a framework, called Uncorrected Hamiltonian Annealing, that runs an AIS-like procedure with uncorrected Hamiltonian MCMC and yields tight, differentiable lower bounds on log Z. We show empirically that our method outperforms competing approaches, and that the ability to tune its parameters using reparameterization gradients can lead to large performance improvements.
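For reference, the abstract contrasts UHA with standard AIS over Hamiltonian MCMC. Below is a minimal NumPy sketch of that AIS baseline on a toy one-dimensional Gaussian target with a known normalizer; the toy target, helper names, and tuning constants are illustrative assumptions, not from the paper. Averaging log w over chains gives a stochastic lower bound on log Z, since E[log w] <= log E[w] = log Z by Jensen's inequality.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem (illustrative, not from the paper): unnormalized target
# gamma(x) = Z * N(x; mu, sigma^2) with known Z, so the bound can be checked.
Z_TRUE, MU, SIGMA = 10.0, 3.0, 0.5

def log_q(x):                      # initial distribution q = N(0, 1)
    return -0.5 * x**2 - 0.5 * np.log(2 * np.pi)

def log_gamma_target(x):           # log of the unnormalized target
    return np.log(Z_TRUE) - 0.5 * ((x - MU) / SIGMA) ** 2 - np.log(SIGMA * np.sqrt(2 * np.pi))

def log_pi(x, beta):               # geometric bridge: q^(1-beta) * gamma^beta
    return (1 - beta) * log_q(x) + beta * log_gamma_target(x)

def grad_log_pi(x, beta):          # analytic score of the bridge density
    return (1 - beta) * (-x) + beta * (-(x - MU) / SIGMA**2)

def hmc_step(x, beta, eps=0.2, n_leap=5):
    """One MH-corrected HMC transition leaving pi_beta invariant.
    The accept/reject at the end is the non-differentiable piece that
    Uncorrected Hamiltonian Annealing removes."""
    rho = rng.standard_normal()                              # fresh momentum
    x_new, rho_new = x, rho
    rho_new = rho_new + 0.5 * eps * grad_log_pi(x_new, beta)  # leapfrog half step
    for _ in range(n_leap - 1):
        x_new = x_new + eps * rho_new
        rho_new = rho_new + eps * grad_log_pi(x_new, beta)
    x_new = x_new + eps * rho_new
    rho_new = rho_new + 0.5 * eps * grad_log_pi(x_new, beta)
    # Metropolis correction based on the change in total energy.
    log_accept = (log_pi(x_new, beta) - 0.5 * rho_new**2) - (log_pi(x, beta) - 0.5 * rho**2)
    return x_new if np.log(rng.uniform()) < log_accept else x

def ais_log_weight(n_steps=100):
    """One AIS chain: accumulates the standard AIS log weight, whose
    average over chains lower-bounds log Z by Jensen's inequality."""
    betas = np.linspace(0.0, 1.0, n_steps + 1)
    x = rng.standard_normal()                    # sample from q
    log_w = 0.0
    for k in range(1, n_steps + 1):
        log_w += log_pi(x, betas[k]) - log_pi(x, betas[k - 1])
        x = hmc_step(x, betas[k])                # move under pi_{beta_k}
    return log_w

log_ws = np.array([ais_log_weight() for _ in range(64)])
print(f"E[log w] = {log_ws.mean():.3f}  (lower-bounds log Z = {np.log(Z_TRUE):.3f})")
```

Dropping the accept/reject step, as UHA does, makes the whole estimator differentiable in the step size, the annealing schedule, and the parameters of q, which is what allows tuning with reparameterization gradients.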


Related research:

- Hamiltonian Variational Auto-Encoder (05/29/2018): Variational Auto-Encoders (VAEs) have become very popular techniques to ...
- Variational Inference with Hamiltonian Monte Carlo (09/26/2016): Variational inference lies at the core of many state-of-the-art algorith...
- Energy-Inspired Models: Learning with Sampler-Induced Distributions (10/31/2019): Energy-based models (EBMs) are powerful probabilistic models, but suffer...
- Quasi-symplectic Langevin Variational Autoencoder (09/02/2020): Variational autoencoder (VAE) as one of the well investigated generative...
- A Tight Lower Bound for Counting Hamiltonian Cycles via Matrix Rank (09/07/2017): For even k, the matchings connectivity matrix M_k encodes which pairs of...
- Surrogate Likelihoods for Variational Annealed Importance Sampling (12/22/2021): Variational inference is a powerful paradigm for approximate Bayesian in...
