Global Autoregressive Models for Data-Efficient Sequence Learning

by Tetiana Parshakova, et al.
Stanford University

Standard autoregressive seq2seq models are easily trained by maximum likelihood, but tend to perform poorly under small-data conditions. We introduce a class of seq2seq models, GAMs (Global Autoregressive Models), which combine an autoregressive component with a log-linear component, allowing the use of global a priori features to compensate for the lack of data. We train these models in two steps. In the first step, we obtain an unnormalized GAM that maximizes the likelihood of the data but is unsuitable for fast inference or evaluation. In the second step, we use this GAM to train, by distillation, a second autoregressive model that approximates the normalized distribution associated with the GAM and can be used for fast inference and evaluation. Our experiments focus on language modelling under synthetic conditions and show a strong perplexity reduction when using the second autoregressive model in place of the standard one.
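The combination described above can be sketched in a few lines. Assuming the usual energy-based form for such hybrid models, P(x) ∝ r(x) · exp(λ · φ(x)), where r is the autoregressive component and φ extracts global features of the whole sequence, the unnormalized GAM log-score is just the autoregressive log-probability plus a weighted sum of global features. The toy conditional table and the sequence-length-free feature below are illustrative inventions, not the paper's actual setup:

```python
import math

def ar_log_prob(seq, cond_probs):
    """Log-probability of `seq` under a toy autoregressive model,
    given as a dict mapping (prefix, token) -> conditional probability."""
    logp, prefix = 0.0, ()
    for tok in seq:
        logp += math.log(cond_probs[(prefix, tok)])
        prefix = prefix + (tok,)
    return logp

def gam_log_score(seq, cond_probs, lambdas, features):
    """Unnormalized GAM log-score: log r(x) + sum_k lambda_k * phi_k(x).
    Normalizing this over all sequences is what makes direct use of the
    GAM impractical, motivating the distillation step."""
    global_term = sum(l * f(seq) for l, f in zip(lambdas, features))
    return ar_log_prob(seq, cond_probs) + global_term

# Toy autoregressive model over {a, b}, sequences of length 2 (hypothetical).
cond_probs = {
    ((), "a"): 0.6, ((), "b"): 0.4,
    (("a",), "a"): 0.5, (("a",), "b"): 0.5,
    (("b",), "a"): 0.3, (("b",), "b"): 0.7,
}

# One global feature: how many "a" tokens the whole sequence contains.
features = [lambda s: s.count("a")]
lambdas = [0.5]

score = gam_log_score(("a", "a"), cond_probs, lambdas, features)
# log(0.6 * 0.5) + 0.5 * 2 = log(0.3) + 1.0
print(score)
```

Because the global feature looks at the entire sequence, the score cannot be decomposed token by token, which is why the paper distills the normalized GAM distribution back into a second autoregressive model for fast inference.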


