Challenges and Opportunities in High-dimensional Variational Inference

03/01/2021
by   Akash Kumar Dhaka, et al.

We explore the limitations of, and best practices for, using black-box variational inference to estimate posterior summaries of the model parameters. By taking an importance sampling perspective, we are able to explain and empirically demonstrate: 1) why the intuitions about the behavior of approximating families and divergences for low-dimensional posteriors fail for higher-dimensional posteriors, 2) how we can diagnose the pre-asymptotic reliability of variational inference in practice by examining the behavior of the density ratios (i.e., importance weights), 3) why the choice of variational objective is not as relevant for higher-dimensional posteriors, and 4) why, although flexible variational families can provide some benefits in higher dimensions, they also introduce additional optimization challenges. Based on these findings, for high-dimensional posteriors we recommend using the exclusive KL divergence, which is the most stable and easiest to optimize, and then focusing on improving the variational family or using model parameter transformations to make the posterior more similar to the approximating family. Our results also show that in low to moderate dimensions, heavy-tailed variational families and mass-covering divergences can increase the chances that the approximation can be improved by importance sampling.
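The density-ratio diagnostic mentioned in point 2) can be illustrated with a short, self-contained sketch (not the authors' code). Assuming a toy setup in which a heavier-tailed Student-t target is approximated by a standard Gaussian, it computes self-normalized importance weights and summarizes them by the relative effective sample size and the largest normalized weight; as the dimension grows, the weights concentrate on a few draws and the effective sample size collapses, which is the pre-asymptotic failure mode discussed in the abstract.

```python
# Minimal sketch: diagnose a variational fit by inspecting the density
# ratios r(z) = p(z) / q(z) (importance weights) between the target p
# and the approximation q. The target and q used here are toy choices.
import numpy as np
from scipy import stats

def importance_diagnostics(d, n_draws=4000, df=5, seed=0):
    rng = np.random.default_rng(seed)

    # Approximating family q: standard mean-field Gaussian in d dimensions.
    z = rng.standard_normal((n_draws, d))
    log_q = stats.norm.logpdf(z).sum(axis=1)

    # Target p: product of independent Student-t margins (heavier tails than q).
    log_p = stats.t.logpdf(z, df=df).sum(axis=1)

    # Log density ratios, stabilized and self-normalized.
    log_w = log_p - log_q
    log_w -= log_w.max()
    w = np.exp(log_w)
    w /= w.sum()

    # Diagnostics: relative effective sample size and weight concentration.
    ess = 1.0 / np.sum(w ** 2)
    return ess / n_draws, w.max()

for d in (2, 20, 200):
    rel_ess, max_w = importance_diagnostics(d)
    print(f"d={d:4d}  relative ESS={rel_ess:.3f}  largest weight={max_w:.3f}")
```

In practice one would compute the same ratios between the model's unnormalized posterior and the fitted variational density, and a tail diagnostic such as Pareto-smoothed importance sampling can be applied to the resulting log weights.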


