
Ladder Variational Autoencoders

02/06/2016
by Casper Kaae Sønderby, et al.

Variational Autoencoders are powerful models for unsupervised learning. However, deep models with several layers of dependent stochastic variables are difficult to train, which limits the improvements obtained with these highly expressive models. We propose a new inference model, the Ladder Variational Autoencoder, that recursively corrects the generative distribution by a data-dependent approximate likelihood in a process resembling the recently proposed Ladder Network. We show that this model provides state-of-the-art predictive log-likelihood and a tighter log-likelihood lower bound compared to the purely bottom-up inference in layered Variational Autoencoders and other generative models. We provide a detailed analysis of the learned hierarchical latent representation and show that our new inference model is qualitatively different and utilizes a deeper, more distributed hierarchy of latent variables. Finally, we observe that batch normalization and deterministic warm-up (gradually turning on the KL term) are crucial for training variational models with many stochastic layers.
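
The two training ingredients mentioned above can be made concrete with a short sketch. The code below is plain Python/NumPy, not the authors' implementation; the function names and the linear warm-up schedule are assumptions for illustration. It shows (i) the precision-weighted combination of a data-dependent bottom-up Gaussian estimate with the top-down generative Gaussian, which is how the generative distribution is recursively corrected at each stochastic layer, and (ii) a deterministic warm-up coefficient that gradually turns on the KL term.

# Minimal sketch of the two mechanisms described in the abstract.
# Function names and the linear warm-up schedule are illustrative
# assumptions, not taken from the authors' code.
import numpy as np


def precision_weighted_combine(mu_likelihood, var_likelihood, mu_prior, var_prior):
    """Merge a data-dependent (bottom-up) Gaussian estimate with the
    top-down generative (prior) Gaussian by precision weighting, giving
    the corrected approximate posterior for one stochastic layer."""
    precision = 1.0 / var_likelihood + 1.0 / var_prior
    var_q = 1.0 / precision
    mu_q = var_q * (mu_likelihood / var_likelihood + mu_prior / var_prior)
    return mu_q, var_q


def warmup_beta(epoch, n_warmup_epochs=200):
    """Deterministic warm-up: scale the KL term from 0 to 1 over the
    first n_warmup_epochs epochs (a simple linear schedule is assumed)."""
    return min(1.0, epoch / n_warmup_epochs)


if __name__ == "__main__":
    # Toy example for a single latent unit.
    mu_q, var_q = precision_weighted_combine(
        mu_likelihood=np.array([0.8]), var_likelihood=np.array([0.5]),
        mu_prior=np.array([0.0]), var_prior=np.array([1.0]))
    print(mu_q, var_q)            # posterior pulled toward the data-dependent term
    print(warmup_beta(epoch=50))  # KL weight 0.25 early in training

The combined posterior weights each mean by its precision, so a layer whose bottom-up estimate is uncertain falls back toward the generative prior, while the warm-up coefficient keeps the KL penalty from collapsing the higher stochastic layers early in training.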

Code Repositories

LVAE

Code for "How to Train Deep Variational Autoencoders and Probabilistic Ladder Networks"

