Deep State Space Models for Unconditional Word Generation

06/12/2018
by Florian Schmidt, et al.

Autoregressive feedback is considered a necessity for successful unconditional text generation using stochastic sequence models. However, such feedback is known to introduce systematic biases into training, and it obscures a principle of generation: committing to global information and forgetting local nuances. We show that a non-autoregressive deep state space model with a clear separation of global and local uncertainty can be built from only two ingredients: an independent noise source and a deterministic transition function. Recent advances in flow-based variational inference allow training an evidence lower bound without resorting to annealing, auxiliary losses, or similar measures. The result is a highly interpretable generative model on par with a comparable autoregressive model on the task of word generation.
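The generative mechanism described in the abstract admits a very small sketch: a latent state evolves through a deterministic transition driven only by fresh, independent noise at each step, and each symbol is emitted from the current state without feeding previously generated symbols back in. The following is a minimal illustration, assuming PyTorch and a GRU cell as the deterministic transition; the class and parameter names are hypothetical and not taken from the paper.

```python
# Minimal sketch of a non-autoregressive deep state space generator.
# Assumptions: PyTorch, a GRU cell as the deterministic transition f,
# and illustrative names (NonAutoregressiveSSM, noise_dim, ...) not from the paper.
import torch
import torch.nn as nn


class NonAutoregressiveSSM(nn.Module):
    def __init__(self, noise_dim=32, state_dim=128, vocab_size=30):
        super().__init__()
        self.noise_dim = noise_dim
        self.transition = nn.GRUCell(noise_dim, state_dim)  # deterministic transition f
        self.emit = nn.Linear(state_dim, vocab_size)        # per-step output distribution

    def forward(self, s0, num_steps):
        # s_t = f(s_{t-1}, xi_t) with xi_t ~ N(0, I): the state carries the
        # global information, the noise injects local variation. No emitted
        # symbol is ever fed back into the transition.
        s, logits = s0, []
        for _ in range(num_steps):
            xi = torch.randn(s.size(0), self.noise_dim)  # independent noise source
            s = self.transition(xi, s)
            logits.append(self.emit(s))                  # p(w_t | s_t)
        return torch.stack(logits, dim=1)                # (batch, T, vocab)


# Usage: sample symbol sequences of length 10 for a batch of 4 initial states.
model = NonAutoregressiveSSM()
s0 = torch.zeros(4, 128)
symbol_ids = model(s0, num_steps=10).argmax(dim=-1)     # (4, 10)
```

Training the corresponding evidence lower bound with a flow-based posterior over the noise variables is what the paper addresses; this sketch only captures the generative direction.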


Related research

Autoregressive Text Generation Beyond Feedback Loops (08/30/2019)
Autoregressive state transitions, where predictions are conditioned on p...

Scalable approximate inference for state space models with normalising flows (10/02/2019)
By exploiting mini-batch stochastic gradient optimisation, variational i...

Stochastic Annealing for Variational Inference (05/25/2015)
We empirically evaluate a stochastic annealing strategy for Bayesian pos...

Latent Normalizing Flows for Discrete Sequences (01/29/2019)
Normalizing flows have been shown to be a powerful class of generative m...

Deep Variational Bayes Filters: Unsupervised Learning of State Space Models from Raw Data (05/20/2016)
We introduce Deep Variational Bayes Filters (DVBF), a new method for uns...

Text Generation with Deep Variational GAN (04/27/2021)
Generating realistic sequences is a central task in many machine learnin...
