Learning Neural Templates for Text Generation

08/30/2018
by   Sam Wiseman, et al.

While neural encoder-decoder models have had significant empirical success in text generation, several problems with this style of generation remain unaddressed. Encoder-decoder models are largely (a) uninterpretable, and (b) difficult to control in terms of their phrasing or content. This work proposes a neural generation system using a hidden semi-Markov model (HSMM) decoder, which learns latent, discrete templates jointly with learning to generate. We show that this model learns useful templates, and that these templates make generation both more interpretable and controllable. Furthermore, we show that this approach scales to real data sets and achieves strong performance nearing that of encoder-decoder text generation models.
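To make the HSMM idea concrete: a semi-Markov decoder assigns each latent state to a whole multi-token segment, so decoding a sentence also recovers a "template" (the sequence of states and segment boundaries). The sketch below is a minimal, illustrative semi-Markov Viterbi decoder in plain Python; the scoring functions, state names, and toy example are hypothetical stand-ins, not the paper's neural parameterization.

```python
def best_segmentation(tokens, states, seg_score, trans_score, init_score, max_len=3):
    """Viterbi-style DP over a hidden semi-Markov model: each latent state
    emits a whole multi-token segment, so the best path yields both the
    segmentation and its latent template."""
    n = len(tokens)
    # best[i][s] = (score, backpointer) for the best segmentation of
    # tokens[:i] whose last segment is labeled with state s
    best = [{} for _ in range(n + 1)]
    best[0] = {None: (0.0, None)}  # sentinel "start" state
    for i in range(1, n + 1):
        for s in states:
            cand = []
            for l in range(1, min(max_len, i) + 1):  # segment length
                j = i - l
                emit = seg_score(s, tokens[j:i])
                for prev, (score, _) in best[j].items():
                    step = init_score(s) if prev is None else trans_score(prev, s)
                    cand.append((score + step + emit, (j, prev)))
            if cand:
                best[i][s] = max(cand, key=lambda x: x[0])
    # backtrace to recover (state, segment) pairs
    s = max(best[n], key=lambda k: best[n][k][0])
    i, template = n, []
    while s is not None:
        _, (j, prev) = best[i][s]
        template.append((s, tuple(tokens[j:i])))
        i, s = j, prev
    return list(reversed(template))

# Toy scores (hypothetical): reward three segment/state pairs, penalize
# everything else, and allow only NAME -> VERB -> OBJ transitions.
GOOD = {("NAME", ("the", "cafe")), ("VERB", ("serves",)), ("OBJ", ("coffee",))}
def seg_score(s, seg): return 1.0 if (s, tuple(seg)) in GOOD else -5.0
def trans_score(p, s): return 0.0 if (p, s) in {("NAME", "VERB"), ("VERB", "OBJ")} else -10.0
def init_score(s): return 0.0 if s == "NAME" else -10.0

template = best_segmentation(["the", "cafe", "serves", "coffee"],
                             ["NAME", "VERB", "OBJ"],
                             seg_score, trans_score, init_score)
# -> [('NAME', ('the', 'cafe')), ('VERB', ('serves',)), ('OBJ', ('coffee',))]
```

In the paper, the segment and transition scores come from learned neural distributions rather than hand-written tables, but the decoding structure is what makes the latent templates discrete and inspectable.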


Related research

Text Generation with Exemplar-based Adaptive Decoding (04/09/2019)
We propose a novel conditioned text generation model. It draws inspirati...

Select and Attend: Towards Controllable Content Selection in Text Generation (09/10/2019)
Many text generation tasks naturally contain two steps: content selectio...

A Semi-Supervised Approach for Low-Resourced Text Generation (06/03/2019)
Recently, encoder-decoder neural models have achieved great success on t...

DiffuSIA: A Spiral Interaction Architecture for Encoder-Decoder Text Diffusion (05/19/2023)
Diffusion models have emerged as the new state-of-the-art family of deep...

Latent Template Induction with Gumbel-CRFs (11/29/2020)
Learning to control the structure of sentences is a challenging problem ...

Variational Template Machine for Data-to-Text Generation (02/04/2020)
How to generate descriptions from structured data organized in tables? E...

Low-Resource Response Generation with Template Prior (09/26/2019)
We study open domain response generation with limited message-response p...
