Variational Refinement for Importance Sampling Using the Forward Kullback-Leibler Divergence

06/30/2021
by Ghassen Jerfel, et al.

Variational Inference (VI) is a popular alternative to asymptotically exact sampling in Bayesian inference. Its main workhorse is optimization over a reverse Kullback-Leibler divergence (RKL), which typically underestimates the tails of the posterior, leading to miscalibration and potential degeneracy. Importance sampling (IS), on the other hand, is often used to fine-tune and de-bias the estimates of approximate Bayesian inference procedures. The quality of IS crucially depends on the choice of the proposal distribution. Ideally, the proposal distribution has heavier tails than the target, a property rarely achieved by minimizing the RKL. We thus propose a novel combination of optimization and sampling techniques for approximate Bayesian inference: we construct an IS proposal distribution by minimizing a forward KL (FKL) divergence. This approach guarantees asymptotic consistency and fast convergence towards both the optimal IS estimator and the optimal variational approximation. We empirically demonstrate on real data that our method is competitive with variational boosting and MCMC.
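To make the mass-covering intuition concrete, here is a minimal sketch (not the paper's algorithm) of adaptive importance sampling in one dimension: for a Gaussian proposal family, minimizing the forward KL divergence KL(p || q) reduces to matching the target's mean and variance, which we estimate with self-normalized importance weights and iterate. The Student-t target, the function names, and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)


def log_target(x):
    # Unnormalized log-density of a Student-t target with 5 degrees of
    # freedom (an assumed heavy-tailed example, not from the paper).
    nu = 5.0
    return -0.5 * (nu + 1.0) * np.log1p(x ** 2 / nu)


def fit_proposal_fkl(n_iters=20, n_samples=2000, mu=0.0, sigma=3.0):
    """Fit a Gaussian IS proposal by forward-KL moment matching.

    For a Gaussian family, argmin_q KL(p || q) matches the mean and
    variance of p; both moments are estimated here with self-normalized
    importance weights under the current proposal.
    """
    for _ in range(n_iters):
        x = rng.normal(mu, sigma, size=n_samples)
        # Log proposal density up to an additive constant (it cancels
        # in the self-normalized weights).
        log_q = -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)
        log_w = log_target(x) - log_q
        w = np.exp(log_w - log_w.max())  # stabilize before normalizing
        w /= w.sum()
        # Importance-weighted moment estimates of the target.
        mu = np.sum(w * x)
        sigma = np.sqrt(np.sum(w * (x - mu) ** 2))
    return mu, sigma


mu, sigma = fit_proposal_fkl()
# A Student-t(5) has mean 0 and standard deviation sqrt(5/3) ~ 1.29,
# so the fitted proposal should settle near (0, 1.29).
```

Note the contrast with a reverse-KL fit: RKL would be happy with a proposal narrower than the target (mode-seeking), whereas the forward-KL moment match forces the proposal variance up to cover the target's tails, which is exactly what keeps importance weights bounded in practice.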
