Latent Template Induction with Gumbel-CRFs

by Yao Fu, et al.

Learning to control the structure of sentences is a challenging problem in text generation. Existing work either relies on simple deterministic approaches or on reinforcement learning over hard, non-differentiable structures. We explore the use of structured variational autoencoders to infer latent templates for sentence generation, using a soft, continuous relaxation so that reparameterization can be used for training. Specifically, we propose the Gumbel-CRF, a continuous relaxation of the CRF sampling algorithm based on a relaxed Forward-Filtering Backward-Sampling (FFBS) procedure. As a reparameterized gradient estimator, the Gumbel-CRF gives more stable gradients than score-function-based estimators. As a structured inference network, it learns interpretable templates during training, which allow us to control the decoder at test time. We demonstrate the effectiveness of our method with experiments on data-to-text generation and unsupervised paraphrase generation.
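The core idea can be illustrated with a minimal sketch: run standard forward filtering over a linear-chain CRF, then replace each categorical draw in the backward-sampling pass with a Gumbel-softmax relaxation so that the sampled state sequence is a differentiable "soft" one-hot sequence. This is an assumption-laden illustration, not the paper's implementation; the function names (`gumbel_softmax`, `relaxed_ffbs`) and the way the soft next state is folded into the backward logits are choices made here for brevity.

```python
import numpy as np

def gumbel_softmax(logits, tau, rng):
    # Draw a relaxed one-hot sample from a categorical via the Gumbel-softmax trick.
    g = -np.log(-np.log(rng.uniform(low=1e-12, high=1.0, size=logits.shape)))
    y = (logits + g) / tau
    y = y - y.max()                      # numerical stability
    e = np.exp(y)
    return e / e.sum()

def relaxed_ffbs(emissions, transitions, tau=0.5, seed=0):
    """Relaxed forward-filtering backward-sampling for a linear-chain CRF.

    emissions:   (T, K) per-position state scores
    transitions: (K, K) score of moving from state i to state j
    Returns a (T, K) matrix of soft (relaxed one-hot) state samples.
    """
    rng = np.random.default_rng(seed)
    T, K = emissions.shape

    # Forward filtering in log space: log_alpha[t, j] = log-sum of all
    # prefix scores ending in state j at position t.
    log_alpha = np.zeros((T, K))
    log_alpha[0] = emissions[0]
    for t in range(1, T):
        scores = log_alpha[t - 1][:, None] + transitions + emissions[t][None, :]
        m = scores.max(axis=0)
        log_alpha[t] = m + np.log(np.exp(scores - m).sum(axis=0))

    # Backward sampling: each exact categorical draw is replaced by a
    # Gumbel-softmax sample, so the whole sequence stays differentiable.
    soft_states = np.zeros((T, K))
    soft_states[T - 1] = gumbel_softmax(log_alpha[T - 1], tau, rng)
    for t in range(T - 2, -1, -1):
        # Condition on the soft next state by mixing transition columns
        # (one simple way to handle a relaxed, non-integer sample).
        logits = log_alpha[t] + transitions @ soft_states[t + 1]
        soft_states[t] = gumbel_softmax(logits, tau, rng)
    return soft_states
```

As the temperature `tau` approaches zero, each row of the returned matrix approaches a hard one-hot sample from the exact FFBS distribution; larger `tau` trades sample fidelity for smoother, lower-variance gradients.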



