On the relationship between variational inference and adaptive importance sampling

07/24/2019
by Axel Finke et al.

The importance-weighted autoencoder (IWAE) (Burda et al., 2016) and the reweighted wake-sleep (RWS) algorithm (Bornschein and Bengio, 2015) are popular approaches that employ multiple samples to achieve bias reductions compared to standard variational methods. However, their relationship has hitherto been unclear. We introduce a simple, unified framework for multi-sample variational inference termed adaptive importance sampling for learning (AISLE) and show that it admits IWAE and RWS as special cases. Through a principled application of a variance-reduction technique from Tucker et al. (2019), we also show that the sticking-the-landing (STL) gradient from Roeder et al. (2017), which previously lacked theoretical justification, can be recovered as a special case of RWS (and hence of AISLE). In particular, this indicates that the breakdown of RWS -- but not of STL -- observed in Tucker et al. (2019) may not be attributable to the lack of a joint objective for the generative-model and inference-network parameters, as previously conjectured. Finally, we argue that our adaptive-importance-sampling interpretation of variational inference leads to more natural and principled extensions to sequential Monte Carlo methods than the IWAE-type multi-sample-objective interpretation.
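To make the estimators mentioned in the abstract concrete, the following is a minimal sketch (not taken from the paper) of a K-sample IWAE-style bound together with the STL variant of its gradient, written in PyTorch with a diagonal-Gaussian inference network as a stand-in. The function and argument names (iwae_bound, log_joint, q_mu, q_log_std) are illustrative assumptions, not the authors' code.

```python
# Minimal sketch, not from the paper: a K-sample IWAE-style bound with an
# optional STL gradient (Roeder et al., 2017). Assumes a diagonal-Gaussian
# inference network q(z|x) = N(q_mu, exp(q_log_std)^2); all names are
# illustrative.
import math
import torch

def iwae_bound(log_joint, q_mu, q_log_std, x, K, stl=False):
    """Return log (1/K) * sum_k p(x, z_k) / q(z_k | x) with z_k ~ q(z|x).

    log_joint(x, z) is assumed to return log p(x, z_k) for each of the K
    samples. With stl=True, the variational parameters are detached inside
    the density q(z_k|x), so their gradient flows only through the
    reparameterised samples z_k (the 'sticking-the-landing' estimator).
    """
    eps = torch.randn(K, *q_mu.shape)        # K reparameterisation noises
    z = q_mu + q_log_std.exp() * eps         # z_k = mu + sigma * eps_k

    mu, log_std = (q_mu.detach(), q_log_std.detach()) if stl else (q_mu, q_log_std)
    log_q = torch.distributions.Normal(mu, log_std.exp()).log_prob(z).sum(-1)

    log_w = log_joint(x, z) - log_q          # log unnormalised importance weights
    return torch.logsumexp(log_w, dim=0) - math.log(K)
```

In this sketch, maximising the bound with stl=False corresponds to the usual IWAE multi-sample objective, while stl=True drops the score-function contribution of q from the gradient, which is the STL estimator the abstract refers to.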


