Text Generation with Exemplar-based Adaptive Decoding

04/09/2019
by Hao Peng, et al.

We propose a novel conditioned text generation model. It draws inspiration from traditional template-based text generation techniques, where the source provides the content (i.e., what to say) and the template influences how to say it. Building on the successful encoder-decoder paradigm, the model first encodes a content representation from the given input text; to produce the output, it retrieves exemplar text from the training data as "soft templates," which are then used to construct an exemplar-specific decoder. We evaluate the proposed model on abstractive text summarization and data-to-text generation. Empirical results show that it achieves strong performance and outperforms comparable baselines.
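The abstract sketches a two-step mechanism: retrieve a training example similar to the source, then let that exemplar shape the decoder itself. Below is a minimal PyTorch sketch of this idea, not the authors' implementation; the cosine-similarity retrieval, the low-rank weight construction, and all names (retrieve_exemplar, ExemplarAdaptiveDecoder, rank, the GRU cell) are illustrative assumptions.

```python
# Minimal sketch, not the authors' code: exemplar retrieval plus an
# exemplar-conditioned ("adaptive") decoder. The exact parameterization
# (low-rank hypernetwork, GRU cell) is an assumption for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

def retrieve_exemplar(src_vec, train_vecs):
    """Return the index of the training encoding most similar to the
    source encoding (cosine similarity); it serves as the soft template."""
    sims = F.cosine_similarity(src_vec.unsqueeze(0), train_vecs, dim=-1)
    return sims.argmax().item()

class ExemplarAdaptiveDecoder(nn.Module):
    """GRU decoder whose input projection is built, per exemplar,
    by a small hypernetwork (low-rank factors U @ V)."""
    def __init__(self, vocab_size, emb_dim, hid_dim, ex_dim, rank=16):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.to_u = nn.Linear(ex_dim, emb_dim * rank)   # exemplar -> U
        self.to_v = nn.Linear(ex_dim, rank * hid_dim)   # exemplar -> V
        self.cell = nn.GRUCell(hid_dim, hid_dim)
        self.out = nn.Linear(hid_dim, vocab_size)
        self.emb_dim, self.hid_dim, self.rank = emb_dim, hid_dim, rank

    def forward(self, tokens, exemplar_vec, h0):
        # Exemplar-specific projection; one exemplar shared by the batch.
        U = self.to_u(exemplar_vec).view(self.emb_dim, self.rank)
        V = self.to_v(exemplar_vec).view(self.rank, self.hid_dim)
        W = U @ V                                        # (emb_dim, hid_dim)
        h, logits = h0, []
        for t in range(tokens.size(1)):                  # tokens: (batch, T)
            x = self.embed(tokens[:, t]) @ W             # exemplar-adapted input
            h = self.cell(x, h)
            logits.append(self.out(h))
        return torch.stack(logits, dim=1)                # (batch, T, vocab)
```

In the paper's terms, the exemplar acts as a soft template: rather than slotting words into it, part of the decoder's parameters is computed from it, so different exemplars yield decoders that phrase the same content differently.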


Related research

08/30/2018 · Learning Neural Templates for Text Generation
11/15/2022 · AutoTemplate: A Simple Recipe for Lexically Constrained Text Generation
05/23/2022 · TempLM: Distilling Language Models into Template-Based Generators
08/18/2021 · GGP: A Graph-based Grouping Planner for Explicit Control of Long Text Generation
11/07/2020 · Template Controllable keywords-to-text Generation
02/04/2020 · Variational Template Machine for Data-to-Text Generation
