Ladder Variational Autoencoders

02/06/2016
by   Casper Kaae Sønderby, et al.

Variational Autoencoders are powerful models for unsupervised learning. However, deep models with several layers of dependent stochastic variables are difficult to train, which limits the improvements obtained with these highly expressive models. We propose a new inference model, the Ladder Variational Autoencoder, which recursively corrects the generative distribution with a data-dependent approximate likelihood, in a process resembling the recently proposed Ladder Network. We show that this model provides state-of-the-art predictive log-likelihood and a tighter log-likelihood lower bound compared to purely bottom-up inference in layered Variational Autoencoders and other generative models. We provide a detailed analysis of the learned hierarchical latent representation and show that our new inference model is qualitatively different and utilizes a deeper, more distributed hierarchy of latent variables. Finally, we observe that batch normalization and deterministic warm-up (gradually turning on the KL term) are crucial for training variational models with many stochastic layers.
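The two training ingredients named in the abstract can be sketched concretely. In the Ladder VAE, each layer's approximate posterior is formed by merging the bottom-up (data-dependent) Gaussian with the top-down (generative) Gaussian by precision weighting, and deterministic warm-up linearly scales the KL term from 0 to 1 early in training. The sketch below is illustrative, not the authors' code; the function names and the warm-up length `n_warmup` are assumptions.

```python
def precision_weighted_merge(mu_q, var_q, mu_p, var_p):
    """Combine the bottom-up approximate-likelihood Gaussian (mu_q, var_q)
    with the top-down generative Gaussian (mu_p, var_p) by precision
    weighting, as in the Ladder VAE inference path."""
    prec_q, prec_p = 1.0 / var_q, 1.0 / var_p
    var = 1.0 / (prec_q + prec_p)              # combined variance
    mu = var * (mu_q * prec_q + mu_p * prec_p)  # precision-weighted mean
    return mu, var

def warmup_beta(epoch, n_warmup=200):
    """Deterministic warm-up: weight on the KL term rises linearly
    from 0 to 1 over the first n_warmup epochs (n_warmup is
    illustrative), then stays at 1."""
    return min(1.0, epoch / n_warmup)
```

For example, merging two Gaussians with equal variance yields the average of their means and half the variance, so the data-dependent signal and the generative prior each pull on the posterior in proportion to their certainty.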


Related research

06/15/2023 · Tree Variational Autoencoders
We propose a new generative hierarchical clustering model that learns a ...

09/21/2023 · Variational Connectionist Temporal Classification for Order-Preserving Sequence Modeling
Connectionist temporal classification (CTC) is commonly adopted for sequ...

07/05/2018 · Learning in Variational Autoencoders with Kullback-Leibler and Renyi Integral Bounds
In this paper we propose two novel bounds for the log-likelihood based o...

04/12/2023 · Explicitly Minimizing the Blur Error of Variational Autoencoders
Variational autoencoders (VAEs) are powerful generative modelling method...

05/27/2016 · Density estimation using Real NVP
Unsupervised learning of probabilistic models is a central yet challengi...

11/24/2021 · A Unified Approach to Variational Autoencoders and Stochastic Normalizing Flows via Markov Chains
Normalizing flows, diffusion normalizing flows and variational autoencod...

09/23/2015 · Deep Temporal Sigmoid Belief Networks for Sequence Modeling
Deep dynamic generative models are developed to learn sequential depende...
