
Stochastic Backpropagation and Approximate Inference in Deep Generative Models

01/16/2014
by Danilo Jimenez Rezende, et al. (Google)

We marry ideas from deep neural networks and approximate Bayesian inference to derive a generalised class of deep, directed generative models, endowed with a new algorithm for scalable inference and learning. Our algorithm introduces a recognition model that represents approximate posterior distributions and acts as a stochastic encoder of the data. We develop stochastic back-propagation -- rules for back-propagation through stochastic variables -- and use this to derive an algorithm that jointly optimises the parameters of both the generative and recognition models. We demonstrate on several real-world data sets that the model generates realistic samples, provides accurate imputations of missing data, and is a useful tool for high-dimensional data visualisation.
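The core of stochastic back-propagation is the Gaussian reparameterisation: a latent sample z ~ N(mu, sigma^2) is rewritten as z = mu + sigma * eps with eps ~ N(0, 1), so gradients of an expectation flow through the deterministic map rather than the random draw. The sketch below is an illustrative toy, not the paper's full deep latent Gaussian model: it estimates the gradients of E[f(z)] for f(z) = z^2, whose expectation under N(mu, sigma^2) is mu^2 + sigma^2, so the true gradients are 2*mu and 2*sigma.

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 1.5, 0.8

# Reparameterise: z = mu + sigma * eps, eps ~ N(0, 1)
eps = rng.standard_normal(200_000)
z = mu + sigma * eps

# Pathwise (reparameterised) gradient estimates for f(z) = z**2:
#   d f / d mu    = f'(z) * dz/dmu    = 2*z
#   d f / d sigma = f'(z) * dz/dsigma = 2*z * eps
grad_mu = np.mean(2 * z)           # analytic value: 2 * mu
grad_sigma = np.mean(2 * z * eps)  # analytic value: 2 * sigma

print(grad_mu, grad_sigma)
```

In a full model, mu and sigma would be outputs of the recognition network, and these pathwise gradients would be back-propagated into its weights alongside the generative parameters.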



Code Repositories

dgm: Deep Generative Model (Torch)

vae-mxnet: MXNet/Gluon implementation of Variational Autoencoders (VAE)