Mixtures of Sparse Autoregressive Networks

11/15/2015
by Marc Goessling, et al.

We consider high-dimensional distribution estimation through autoregressive networks. By combining sparsity, mixtures, and parameter sharing, we obtain a simple model that is fast to train and achieves state-of-the-art or better results on several standard benchmark datasets. Specifically, we use an L1-penalty to regularize the conditional distributions and introduce a procedure for automatic parameter sharing between mixture components. Moreover, we propose a simple distributed representation that permits exact likelihood evaluations, since the latent variables are interleaved with the observable variables and can easily be integrated out. Our model achieves excellent generalization performance and scales well to extremely high dimensions.
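To make the core idea concrete, here is a minimal sketch of a sparse autoregressive network for binary data: each conditional p(x_d | x_{<d}) is an L1-regularized logistic regression, fitted by proximal gradient descent. This is an illustration only, not the authors' implementation; the function names, the proximal-gradient training loop, and all hyperparameters (`lam`, `lr`, `steps`) are assumptions for the sketch. It covers only the sparsity component of the abstract, not the mixture or parameter-sharing machinery.

```python
import numpy as np

def soft_threshold(w, t):
    # Proximal operator of the L1 penalty: shrinks weights toward zero
    # and sets small ones exactly to zero, producing sparse conditionals.
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def fit_sparse_conditional(X_prev, y, lam=0.01, lr=0.1, steps=200):
    # L1-penalized logistic regression for one conditional p(x_d=1 | x_{<d}),
    # trained by proximal gradient descent (hyperparameters are illustrative).
    n, d = X_prev.shape
    w, b = np.zeros(d), 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X_prev @ w + b)))
        g_w = X_prev.T @ (p - y) / n          # gradient of the logistic loss
        g_b = np.mean(p - y)
        w = soft_threshold(w - lr * g_w, lr * lam)
        b -= lr * g_b
    return w, b

def fit_sparse_arn(X, lam=0.01):
    # Fit all D conditionals of the autoregressive factorization
    # p(x) = prod_d p(x_d | x_{<d}) on an n-by-D binary data matrix.
    n, D = X.shape
    params = []
    for d in range(D):
        if d == 0:
            # The first variable has no parents: a plain Bernoulli rate.
            params.append((None, float(np.clip(X[:, 0].mean(), 1e-6, 1 - 1e-6))))
        else:
            params.append(fit_sparse_conditional(X[:, :d], X[:, d], lam))
    return params

def log_likelihood(X, params):
    # Average log-likelihood per example; exact, since the factorization
    # involves no latent variables to marginalize.
    ll = 0.0
    for d, (w, b) in enumerate(params):
        if w is None:
            p = np.full(X.shape[0], b)
        else:
            p = 1.0 / (1.0 + np.exp(-(X[:, :d] @ w + b)))
        ll += np.sum(X[:, d] * np.log(p) + (1 - X[:, d]) * np.log(1 - p))
    return ll / X.shape[0]
```

A mixture, as in the paper, would fit several such networks and weight their likelihoods; since each component's likelihood is exact, the mixture likelihood is too.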

Related research

- 09/05/2019 · FlowSeq: Non-Autoregressive Conditional Sequence Generation with Generative Flow. Most sequence-to-sequence (seq2seq) models are autoregressive; they gene...
- 03/22/2017 · LogitBoost autoregressive networks. Multivariate binary distributions can be decomposed into products of uni...
- 11/30/2017 · Auxiliary Guided Autoregressive Variational Autoencoders. Generative modeling of high-dimensional data is a key problem in machine...
- 09/16/2019 · Global Autoregressive Models for Data-Efficient Sequence Learning. Standard autoregressive seq2seq models are easily trained by max-likelih...
- 07/21/2013 · Mixtures of Common Skew-t Factor Analyzers. A mixture of common skew-t factor analyzers model is introduced for mode...
- 02/24/2022 · On Learning Mixture Models with Sparse Parameters. Mixture models are widely used to fit complex and multimodal datasets. I...
- 03/23/2023 · Adversarially Contrastive Estimation of Conditional Neural Processes. Conditional Neural Processes (CNPs) formulate distributions over functio...
