Recursive Inference for Variational Autoencoders

11/17/2020
by Minyoung Kim, et al.

Inference networks of traditional Variational Autoencoders (VAEs) are typically amortized, resulting in relatively inaccurate posterior approximation compared to instance-wise variational optimization. Recent semi-amortized approaches were proposed to address this drawback; however, their iterative gradient-update procedures can be computationally demanding. To address these issues, in this paper we introduce an accurate amortized inference algorithm. We propose a novel recursive mixture estimation algorithm for VAEs that iteratively augments the current mixture with new components so as to maximally reduce the divergence between the variational and the true posteriors. Using the functional gradient approach, we devise an intuitive learning criterion for selecting a new mixture component: the new component has to improve the data likelihood (lower bound) and, at the same time, be as divergent from the current mixture distribution as possible, thus increasing representational diversity. In contrast to the recently proposed boosted variational inference (BVI), which relies on non-amortized, per-instance optimization, our method keeps inference amortized. A crucial benefit of our approach is that inference at test time requires a single feed-forward pass through the mixture inference network, making it significantly faster than semi-amortized approaches. We show that our approach yields higher test data likelihood than the state of the art on several benchmark datasets.
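To make the greedy step concrete, below is a minimal, hypothetical PyTorch sketch of the objective for one newly added mixture component, written only from the description in the abstract (it is not the authors' code). The encoder/decoder architectures, the Gaussian likelihood, the uniform mixture weights, and the diversity weight beta are all assumptions made for illustration: the new component is trained to maximize a single-sample ELBO while a sample-based KL term pushes its posterior away from the current mixture.

import math
import torch
import torch.nn as nn

def gaussian_log_prob(z, mu, logvar):
    # log N(z; mu, diag(exp(logvar))), summed over the latent dimensions
    return -0.5 * (logvar + (z - mu) ** 2 / logvar.exp() + math.log(2 * math.pi)).sum(-1)

class GaussianEncoder(nn.Module):
    # Amortized Gaussian q(z|x); one instance per mixture component (illustrative architecture)
    def __init__(self, x_dim, z_dim, h=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(x_dim, h), nn.ReLU(), nn.Linear(h, 2 * z_dim))
    def forward(self, x):
        mu, logvar = self.net(x).chunk(2, dim=-1)
        return mu, logvar

def new_component_loss(x, new_enc, decoder, old_encs, beta=1.0):
    # Fit one new component: maximize a single-sample ELBO and, at the same
    # time, stay divergent from the current (uniform) mixture of old components.
    mu, logvar = new_enc(x)
    z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()        # reparameterized sample
    log_q_new = gaussian_log_prob(z, mu, logvar)
    log_prior = gaussian_log_prob(z, torch.zeros_like(z), torch.zeros_like(z))
    log_px_z = -((decoder(z) - x) ** 2).sum(-1)                 # Gaussian likelihood, up to a constant
    elbo = log_px_z + log_prior - log_q_new

    with torch.no_grad():                                       # previously learned components are kept fixed
        old_params = [enc(x) for enc in old_encs]
    log_q_old = torch.stack([gaussian_log_prob(z, m, lv) for m, lv in old_params])
    log_q_mix = torch.logsumexp(log_q_old, dim=0) - math.log(len(old_encs))
    diversity = log_q_new - log_q_mix                           # sample-based estimate of KL(q_new || q_mix)

    return -(elbo + beta * diversity).mean()                    # minimize the negative objective

# Toy usage with dummy data (shapes only)
x_dim, z_dim = 784, 16
decoder = nn.Sequential(nn.Linear(z_dim, 64), nn.ReLU(), nn.Linear(64, x_dim))
old_encs = [GaussianEncoder(x_dim, z_dim)]      # previously trained component(s), frozen
new_enc = GaussianEncoder(x_dim, z_dim)         # component being added in this greedy step
x = torch.randn(8, x_dim)
loss = new_component_loss(x, new_enc, decoder, old_encs, beta=0.5)
loss.backward()

In the full recursive procedure, this step would be repeated with earlier components frozen, and test-time inference would evaluate all components in a single feed-forward pass, consistent with the abstract's claim about speed relative to semi-amortized methods.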


research  02/05/2021  Reducing the Amortization Gap in Variational Autoencoders: A Bayesian Random Function Approach
Variational autoencoder (VAE) is a very successful generative model whos...

research  10/05/2018  Doubly Semi-Implicit Variational Inference
We extend the existing framework of semi-implicit variational inference ...

research  09/30/2019  Tightening Bounds for Variational Inference by Revisiting Perturbation Theory
Variational inference has become one of the most widely used methods in ...

research  11/20/2017  Likelihood Almost Free Inference Networks
Variational inference for latent variable models is prevalent in various...

research  12/01/2021  An adaptive mixture-population Monte Carlo method for likelihood-free inference
This paper focuses on variational inference with intractable likelihood ...

research  01/10/2018  Inference Suboptimality in Variational Autoencoders
Amortized inference has led to efficient approximate inference for large...

research  04/26/2020  Notes on Icebreaker
Icebreaker [1] is new research from MSR that is able to achieve state of...
