Variational Inference with Hamiltonian Monte Carlo

09/26/2016
by Christopher Wolf et al.

Variational inference lies at the core of many state-of-the-art algorithms. To improve the posterior approximation beyond parametric families, it has been proposed to incorporate MCMC steps into the variational lower bound. In this work we explore this idea using steps of the Hamiltonian Monte Carlo (HMC) algorithm, an efficient MCMC method. In particular, we incorporate the acceptance step of the HMC algorithm, which guarantees asymptotic convergence to the true posterior. Additionally, we introduce several extensions of the HMC algorithm geared towards faster convergence. The theoretical advantages of these modifications are reflected by performance improvements in our experimental results.
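
To make the mechanism concrete, here is a minimal sketch of a single HMC transition with the Metropolis acceptance step the abstract refers to. This is an illustrative Python/NumPy example under assumed names (hmc_step, logp, grad_logp, step_size, n_leapfrog are all hypothetical), not the authors' implementation.

    import numpy as np

    def hmc_step(z, logp, grad_logp, step_size=0.1, n_leapfrog=10, rng=None):
        # One HMC transition targeting exp(logp(z)), with the Metropolis
        # acceptance test that leaves the true posterior invariant.
        rng = np.random.default_rng() if rng is None else rng
        v = rng.standard_normal(z.shape)              # resample momentum
        z_new, v_new = z.astype(float), v.copy()

        # Leapfrog integration of the Hamiltonian dynamics.
        v_new += 0.5 * step_size * grad_logp(z_new)   # initial half step
        for _ in range(n_leapfrog - 1):
            z_new += step_size * v_new
            v_new += step_size * grad_logp(z_new)
        z_new += step_size * v_new
        v_new += 0.5 * step_size * grad_logp(z_new)   # final half step

        # Accept or reject based on the change in the Hamiltonian
        # H(z, v) = -logp(z) + 0.5 * |v|^2.
        h_old = -logp(z) + 0.5 * v @ v
        h_new = -logp(z_new) + 0.5 * v_new @ v_new
        if np.log(rng.uniform()) < h_old - h_new:
            return z_new, True                        # accepted
        return z, False                               # rejected

    # Usage: sampling a standard normal target.
    logp = lambda z: -0.5 * z @ z
    grad_logp = lambda z: -z
    z = np.zeros(2)
    for _ in range(1000):
        z, _ = hmc_step(z, logp, grad_logp)

Note the role of the final accept/reject test: dropping it (always keeping z_new) gives cheaper but biased uncorrected dynamics, while retaining it leaves the true posterior invariant, which is the asymptotic guarantee the abstract emphasizes.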


Related research

10/23/2014  Markov Chain Monte Carlo and Variational Inference: Bridging the Gap
            Recent advances in stochastic gradient variational inference have made i...

06/28/2022  Reconstructing the Universe with Variational self-Boosted Sampling
            Forward modeling approaches in cosmology have made it possible to recons...

07/08/2021  MCMC Variational Inference via Uncorrected Hamiltonian Annealing
            Given an unnormalized target distribution we want to obtain approximate ...

08/09/2021  Pathfinder: Parallel quasi-Newton variational inference
            We introduce Pathfinder, a variational method for approximately sampling...

05/16/2022  Ergodic variational flows
            This work presents a new class of variational family – ergodic variation...

03/05/2022  Recursive Monte Carlo and Variational Inference with Auxiliary Variables
            A key challenge in applying Monte Carlo and variational inference (VI) i...

10/16/2018  Metropolis-Hastings view on variational inference and adversarial training
            In this paper we propose to view the acceptance rate of the Metropolis-H...