Variational Inference via Transformations on Distributions

07/09/2017
by Siddhartha Saxena, et al.

Variational inference methods often focus on efficient model optimization, with little emphasis on the choice of the approximating posterior. In this paper, we review and implement several methods that enable us to develop a rich family of approximating posteriors. We show that one particular method, which applies a sequence of invertible transformations to a simple base distribution, yields very rich and complex posterior approximations. We analyze its performance on the MNIST dataset by implementing it within a Variational Autoencoder and demonstrate its effectiveness in learning better posterior distributions.
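The transformation-based approach described above is commonly instantiated with planar normalizing flows (Rezende and Mohamed, 2015): a sample from a simple base distribution is passed through a chain of invertible maps, and the change-of-variables formula tracks how the density transforms. A minimal NumPy sketch of one such chain is below; the function names and the random initialization of the flow parameters are illustrative, not the authors' implementation (in practice the parameters are learned, and `u` is constrained so that `w·u >= -1` to guarantee invertibility).

```python
import numpy as np

def planar_flow(z, u, w, b):
    """One planar flow step f(z) = z + u * tanh(w.z + b).

    Returns the transformed sample and log|det Jacobian|, which is
    log|1 + u.psi(z)| with psi(z) = (1 - tanh^2(w.z + b)) * w.
    """
    a = np.tanh(w @ z + b)            # scalar pre-activation
    f_z = z + u * a                   # transformed sample
    psi = (1.0 - a ** 2) * w          # gradient of tanh(w.z + b) w.r.t. z
    log_det = np.log(np.abs(1.0 + u @ psi))
    return f_z, log_det

# Chain several flows to turn a simple Gaussian sample into a sample
# from a richer distribution, accumulating the log-det-Jacobian terms
# needed to evaluate the flowed density log q_K(z_K) = log q_0(z_0) - sum(log_det).
rng = np.random.default_rng(0)
z = rng.standard_normal(2)            # z_0 ~ base distribution q_0
total_log_det = 0.0
for _ in range(4):                    # illustrative: 4 flow steps, random params
    u = rng.standard_normal(2)
    w = rng.standard_normal(2)
    b = rng.standard_normal()
    z, log_det = planar_flow(z, u, w, b)
    total_log_det += log_det
```

In a VAE, the encoder outputs both the base-distribution parameters and the flow parameters, and the accumulated `total_log_det` enters the ELBO through the density of the final sample `z_K`.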

