MADE: Masked Autoencoder for Distribution Estimation

02/12/2015
by Mathieu Germain et al.

There has been a lot of recent interest in designing neural network models to estimate a distribution from a set of examples. We introduce a simple modification for autoencoder neural networks that yields powerful generative models. Our method masks the autoencoder's parameters to respect autoregressive constraints: each input is reconstructed only from previous inputs in a given ordering. Constrained this way, the autoencoder outputs can be interpreted as a set of conditional probabilities, and their product, the full joint probability. We can also train a single network that can decompose the joint probability in multiple different orderings. Our simple framework can be applied to multiple architectures, including deep ones. Vectorized implementations, such as on GPUs, are simple and fast. Experiments demonstrate that this approach is competitive with state-of-the-art tractable distribution estimators. At test time, the method is significantly faster and scales better than other autoregressive estimators.
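
To make the masking idea concrete, below is a minimal NumPy sketch of the general scheme described in the abstract, not the authors' released implementation. The single hidden layer, the `made_masks` and `made_forward` helper names, the tanh nonlinearity, and the use of the natural input ordering are illustrative assumptions; the key point is that the masks zero out connections so that output d depends only on inputs that precede it in the ordering.

```python
import numpy as np

rng = np.random.default_rng(0)

def made_masks(n_in, n_hidden, rng):
    """Build autoregressive connectivity masks for one hidden layer.

    Input dimension d is assigned degree d (natural ordering). Each hidden
    unit gets a degree in {1, ..., n_in - 1}; a hidden unit of degree m may
    connect to inputs with degree <= m, and output unit d may connect only
    to hidden units with degree < d, so output d sees only inputs x_<d.
    """
    m_in = np.arange(1, n_in + 1)                 # input degrees 1..D
    m_hid = rng.integers(1, n_in, size=n_hidden)  # hidden degrees in [1, D-1]

    mask_in = (m_hid[:, None] >= m_in[None, :]).astype(float)   # (n_hidden, n_in)
    mask_out = (m_in[:, None] > m_hid[None, :]).astype(float)   # (n_in, n_hidden)
    return mask_in, mask_out

def made_forward(x, W1, b1, W2, b2, mask_in, mask_out):
    """Masked forward pass; sigmoid output d is p(x_d = 1 | x_<d)."""
    h = np.tanh(x @ (W1 * mask_in).T + b1)
    logits = h @ (W2 * mask_out).T + b2
    return 1.0 / (1.0 + np.exp(-logits))

# Tiny usage example on a binary vector.
D, H = 5, 16
W1 = rng.normal(scale=0.1, size=(H, D)); b1 = np.zeros(H)
W2 = rng.normal(scale=0.1, size=(D, H)); b2 = np.zeros(D)
mask_in, mask_out = made_masks(D, H, rng)

x = rng.integers(0, 2, size=D).astype(float)
p = made_forward(x, W1, b1, W2, b2, mask_in, mask_out)

# Each output is a Bernoulli conditional; their product is the joint likelihood.
log_joint = np.sum(x * np.log(p + 1e-12) + (1 - x) * np.log(1 - p + 1e-12))
```

Because the masks are fixed element-wise multipliers on the weight matrices, the whole forward pass stays a pair of dense matrix products, which is why vectorized (e.g., GPU) implementations remain simple and fast.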

Related research

- Locally Masked Convolution for Autoregressive Models (06/22/2020)
- Neural Autoregressive Distribution Estimation (05/07/2016)
- HCNAF: Hyper-Conditioned Neural Autoregressive Flow and its Application for Probabilistic Occupancy Map Forecasting (12/17/2019)
- An Improved Training Procedure for Neural Autoregressive Data Completion (11/23/2017)
- Hierarchical Multinomial-Dirichlet model for the estimation of conditional probability tables (08/23/2017)
- Autoregressive Conditional Neural Processes (03/25/2023)
- The DEformer: An Order-Agnostic Distribution Estimating Transformer (06/13/2021)
