Generating Sentences by Editing Prototypes

09/26/2017
by Kelvin Guu et al.

We propose a new generative model of sentences that first samples a prototype sentence from the training corpus and then edits it into a new sentence. Compared to traditional models that generate from scratch either left-to-right or by first sampling a latent sentence vector, our prototype-then-edit model improves perplexity on language modeling and generates higher quality outputs according to human evaluation. Furthermore, the model gives rise to a latent edit vector that captures interpretable semantics such as sentence similarity and sentence-level analogies.
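The prototype-then-edit generative process can be sketched in a few lines. The sketch below is illustrative, not the paper's implementation: the real editor is a sequence-to-sequence network attending to the prototype and conditioned on the latent edit vector, whereas here the corpus, the `SUBSTITUTIONS` table, and the `apply_edit` function are all hypothetical stand-ins chosen so the sketch stays runnable.

```python
import random

# Toy stand-in for the training corpus from which prototypes are sampled.
CORPUS = [
    "the food was great and the service was friendly",
    "the movie was long but the acting was superb",
]

# Hypothetical substitution table mimicking what a trained neural
# editor p(x | prototype, z) might produce; purely illustrative.
SUBSTITUTIONS = {"great": "excellent", "friendly": "attentive",
                 "long": "slow", "superb": "brilliant"}

def sample_edit_vector(dim=8, rng=random):
    """Sample a latent edit vector z from a simple Gaussian prior."""
    return [rng.gauss(0.0, 1.0) for _ in range(dim)]

def apply_edit(prototype, z, rng=random):
    """Placeholder editor: substitute words with a probability driven
    by the magnitude of z (stand-in for the neural editor)."""
    strength = min(1.0, sum(abs(v) for v in z) / len(z))
    words = [SUBSTITUTIONS.get(w, w) if rng.random() < strength else w
             for w in prototype.split()]
    return " ".join(words)

def generate(rng=random):
    prototype = rng.choice(CORPUS)    # step 1: sample a prototype sentence
    z = sample_edit_vector(rng=rng)   # step 2: sample a latent edit vector
    return apply_edit(prototype, z, rng)  # step 3: edit into a new sentence

print(generate())
```

Because edits are local substitutions on a fluent prototype, outputs stay grammatical even when the "editor" is crude; in the actual model, the vector z is what carries the interpretable semantics (similarity, analogies) mentioned in the abstract.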


Related research

11/19/2015 · Generating Sentences from a Continuous Space
The standard recurrent neural network language model (RNNLM) generates s...

04/15/2021 · Sentence-Permuted Paragraph Generation
Generating paragraphs of diverse contents is important in many applicati...

04/19/2018 · Incorporating Pseudo-Parallel Data for Quantifiable Sequence Editing
In the task of quantifiable sequence editing (QuaSE), a model needs to e...

07/03/2020 · Interpretable Sequence Classification Via Prototype Trajectory
We propose a novel interpretable recurrent neural network (RNN) model, c...

06/19/2018 · Response Generation by Context-aware Prototype Editing
Open domain response generation has achieved remarkable progress in rece...

06/17/2020 · Iterative Edit-Based Unsupervised Sentence Simplification
We present a novel iterative, edit-based approach to unsupervised senten...

05/17/2021 · Sentence Similarity Based on Contexts
Existing methods to measure sentence similarity are faced with two chall...
