Incorporating Discriminator in Sentence Generation: a Gibbs Sampling Method

02/25/2018
by   Jinyue Su, et al.

Generating plausible and fluent sentences with desired properties has long been a challenge. Most recent works use recurrent neural networks (RNNs) and their variants to predict the next word given the previous sequence and a target label. In this paper, we propose a novel framework that generates constrained sentences via Gibbs Sampling. Candidate sentences are revised and updated iteratively, with sampled new words replacing old ones. Our experiments show that the proposed method is effective at generating plausible and diverse sentences.
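The iterative revision described in the abstract can be illustrated with a minimal sketch. The following is an assumption-laden toy, not the paper's implementation: it resamples one word position at a time, weighting each candidate word by a caller-supplied `score` function (in the paper's setting this would combine a language model's fluency estimate with a discriminator's probability of the desired property; here `score` is an arbitrary placeholder).

```python
import random

def gibbs_sentence_sampler(init_sentence, vocab, score, n_sweeps=10, seed=0):
    """Toy Gibbs-style sampler over word positions.

    Repeatedly sweeps over the sentence; at each position, every vocabulary
    word is scored with the rest of the sentence held fixed, and a
    replacement is drawn in proportion to those scores. `score` is a
    hypothetical stand-in for a joint fluency/discriminator objective.
    """
    rng = random.Random(seed)
    sent = list(init_sentence)
    for _ in range(n_sweeps):
        for i in range(len(sent)):
            # Score every candidate word at position i, rest of sentence fixed.
            weights = [score(sent[:i] + [w] + sent[i + 1:]) for w in vocab]
            # Sample the new word from the normalized conditional distribution.
            sent[i] = rng.choices(vocab, weights=weights, k=1)[0]
    return sent
```

For example, with a score that rewards agreement with a target sentence, repeated sweeps tend to move the candidate toward high-scoring configurations while the sampling noise keeps the outputs diverse.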


Related research

11/14/2018
CGMH: Constrained Sentence Generation by Metropolis-Hastings Sampling
In real-world applications of natural language generation, there are oft...

09/30/2018
Text Morphing
In this paper, we introduce a novel natural language generation task, te...

04/07/2016
Sentence Level Recurrent Topic Model: Letting Topics Speak for Themselves
We propose Sentence Level Recurrent Topic Model (SLRTM), a new topic mod...

04/07/2017
A Constrained Sequence-to-Sequence Neural Model for Sentence Simplification
Sentence simplification reduces semantic complexity to benefit people wi...

06/21/2018
BFGAN: Backward and Forward Generative Adversarial Networks for Lexically Constrained Sentence Generation
In many natural language generation tasks, incorporating additional know...

09/13/2021
Show Me How To Revise: Improving Lexically Constrained Sentence Generation with XLNet
Lexically constrained sentence generation allows the incorporation of pr...

05/31/2020
"Judge me by my size (noun), do you?" YodaLib: A Demographic-Aware Humor Generation Framework
The subjective nature of humor makes computerized humor generation a cha...
