Multiple Importance Sampling ELBO and Deep Ensembles of Variational Approximations

02/22/2022
by Oskar Kviman, et al.

In variational inference (VI), the marginal log-likelihood is estimated using the standard evidence lower bound (ELBO), or improved versions such as the importance weighted ELBO (IWELBO). We propose the multiple importance sampling ELBO (MISELBO), a versatile yet simple framework. MISELBO is applicable in both amortized and classical VI, and it uses ensembles, e.g., deep ensembles, of independently inferred variational approximations. As far as we are aware, the concept of deep ensembles in amortized VI has not previously been established. We prove that MISELBO provides a tighter bound than the average of standard ELBOs, and demonstrate empirically that it gives tighter bounds than the average of IWELBOs. MISELBO is evaluated in density-estimation experiments that include MNIST and several real-data phylogenetic tree inference problems. First, on the MNIST dataset, MISELBO boosts the density-estimation performance of a state-of-the-art model, the nouveau VAE. Second, in the phylogenetic tree inference setting, our framework enhances a state-of-the-art VI algorithm that uses normalizing flows. Beyond these technical benefits, MISELBO unveils connections between VI and recent advances in the importance sampling literature, paving the way for further methodological advances. We provide our code at <https://github.com/Lagergren-Lab/MISELBO>.
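
The abstract does not spell out the estimator, but the construction it points to, an ensemble of independently inferred approximations combined through multiple importance sampling, can be sketched roughly as follows. The snippet below is a minimal illustration assuming standard balance-heuristic mixture weights; the names `log_joint`, `qs`, and the toy Gaussian setup are hypothetical placeholders, not the authors' implementation (see the linked repository for that).

```python
# Minimal sketch of an ensemble-based multiple importance sampling bound.
# Assumptions (not taken from the paper's code): balance-heuristic mixture
# weights, scipy-style frozen distributions as variational approximations,
# and a hypothetical log_joint(x, z) returning log p(x, z).
import numpy as np
from scipy.special import logsumexp
from scipy.stats import multivariate_normal


def miselbo_estimate(x, log_joint, qs, n_samples=64):
    """Estimate a lower bound on log p(x) from an ensemble of approximations.

    qs: list of S variational approximations, each exposing .rvs(size=...)
        for sampling and .logpdf(z) for log-density evaluation.
    """
    S = len(qs)
    member_bounds = []
    for q_s in qs:
        z = q_s.rvs(size=n_samples)                        # z_l ~ q_s(z | x)
        log_p = np.array([log_joint(x, z_l) for z_l in z])
        # Log of the ensemble mixture density (1/S) * sum_j q_j(z | x).
        log_q_mix = logsumexp(np.stack([q_j.logpdf(z) for q_j in qs]),
                              axis=0) - np.log(S)
        # IWELBO-style log-mean of the importance ratios p(x, z) / q_mix(z | x).
        member_bounds.append(logsumexp(log_p - log_q_mix) - np.log(n_samples))
    # Average the per-member bounds over the ensemble.
    return float(np.mean(member_bounds))


# Toy usage: a standard-normal "joint" and two overlapping Gaussian approximations.
def toy_log_joint(x, z):
    # Stands in for log p(x, z); here simply a fixed 2-D standard normal over z.
    return multivariate_normal(mean=np.zeros(2)).logpdf(z)


ensemble = [
    multivariate_normal(mean=np.array([0.3, 0.0]), cov=0.8 * np.eye(2)),
    multivariate_normal(mean=np.array([-0.3, 0.0]), cov=1.2 * np.eye(2)),
]
print(miselbo_estimate(x=None, log_joint=toy_log_joint, qs=ensemble))
```

With a single importance sample per ensemble member this corresponds, in expectation, to an ensemble version of the standard ELBO, while larger `n_samples` gives the IWELBO-style variant compared in the abstract; the exact form used in the paper may differ in details, so treat this only as a reading aid.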


research · 09/30/2022 · Learning with MISELBO: The Mixture Cookbook
Mixture models in variational inference (VI) is an active field of resea...

research · 03/01/2022 · VaiPhy: a Variational Inference Based Algorithm for Phylogeny
Phylogenetics is a classical methodology in computational biology that t...

research · 06/30/2021 · Monte Carlo Variational Auto-Encoders
Variational auto-encoders (VAE) are popular deep latent variable models ...

research · 08/27/2018 · Importance Weighting and Variational Inference
Recent work used importance sampling ideas for better variational bounds...

research · 10/06/2019 · FIS-GAN: GAN with Flow-based Importance Sampling
Generative Adversarial Networks (GAN) training process, in most cases, a...

research · 05/04/2023 · Quantile Importance Sampling
In Bayesian inference, the approximation of integrals of the form ψ = 𝔼_...
