Importance Weighting and Variational Inference

08/27/2018
by Justin Domke, et al.

Recent work has used importance sampling ideas to obtain better variational bounds on likelihoods. We clarify the applicability of these ideas to pure probabilistic inference by showing that the resulting Importance Weighted Variational Inference (IWVI) technique is an instance of augmented variational inference, thus identifying the looseness in previous work. Experiments confirm IWVI's practicality for probabilistic inference. As a second contribution, we investigate inference with elliptical distributions, which improves accuracy in low dimensions and convergence in high dimensions.
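
The abstract refers to importance-weighted variational bounds. As a concrete illustration, below is a minimal NumPy sketch of the importance-weighted ELBO estimator log((1/M) Σ_m p(x, z_m)/q(z_m)) with z_m drawn from q. The toy Gaussian model, the fixed proposal q, and all function names are illustrative assumptions, not taken from the paper or its code.

```python
import numpy as np
from scipy.special import logsumexp

def iw_elbo(log_joint, log_q, sample_q, num_samples=16, rng=None):
    """One Monte Carlo estimate of the importance-weighted ELBO:
    log( (1/M) * sum_m p(x, z_m) / q(z_m) ),  z_m ~ q.
    With M = 1 this reduces to the standard ELBO estimator."""
    rng = np.random.default_rng() if rng is None else rng
    z = sample_q(rng, num_samples)       # (M,) draws from the proposal q
    log_w = log_joint(z) - log_q(z)      # log importance weights, shape (M,)
    return logsumexp(log_w) - np.log(num_samples)

# Toy model (illustrative): z ~ N(0, 1), x | z ~ N(z, 1), observed x = 2.0,
# with a fixed Gaussian proposal q(z) = N(1.0, 1.5^2).
x_obs = 2.0
q_mean, q_std = 1.0, 1.5

def log_joint(z):
    return (-0.5 * z**2 - 0.5 * np.log(2 * np.pi)
            - 0.5 * (x_obs - z)**2 - 0.5 * np.log(2 * np.pi))

def log_q(z):
    return -0.5 * ((z - q_mean) / q_std)**2 - np.log(q_std) - 0.5 * np.log(2 * np.pi)

def sample_q(rng, m):
    return q_mean + q_std * rng.standard_normal(m)

# Larger M gives a tighter (in expectation, nondecreasing) lower bound on
# log p(x); here the exact value is log N(x=2; 0, 2) ~ -2.2655.
rng = np.random.default_rng(0)
for m in (1, 16, 256):
    est = np.mean([iw_elbo(log_joint, log_q, sample_q, m, rng) for _ in range(2000)])
    print(f"M={m:4d}  IW-ELBO estimate: {est:.4f}")
```

In this sketch the gap between the estimate and log p(x) shrinks as M grows, which is the behavior the importance-weighted bound is designed to exploit.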

Related research

GFlowNets and variational inference (10/02/2022)
This paper builds bridges between two families of probabilistic algorith...

Challenges and Opportunities in High-dimensional Variational Inference (03/01/2021)
We explore the limitations of and best practices for using black-box var...

Improving Explorability in Variational Inference with Annealed Variational Objectives (09/06/2018)
Despite the advances in the representational capacity of approximate dis...

Yes, but Did It Work?: Evaluating Variational Inference (02/07/2018)
While it's always possible to compute a variational approximation to a p...

Multiple Importance Sampling ELBO and Deep Ensembles of Variational Approximations (02/22/2022)
In variational inference (VI), the marginal log-likelihood is estimated ...

Estimators of Entropy and Information via Inference in Probabilistic Models (02/24/2022)
Estimating information-theoretic quantities such as entropy and mutual i...
