CGMH: Constrained Sentence Generation by Metropolis-Hastings Sampling

11/14/2018
by   Ning Miao, et al.

In real-world applications of natural language generation, there are often constraints on the target sentences in addition to fluency and naturalness requirements. Existing language generation techniques are usually based on recurrent neural networks (RNNs). However, it is non-trivial to impose constraints on RNNs while maintaining generation quality, since RNNs generate sentences sequentially (or with beam search) from the first word to the last. In this paper, we propose CGMH, a novel approach using Metropolis-Hastings sampling for constrained sentence generation. CGMH allows complicated constraints such as the occurrence of multiple keywords in the target sentences, which cannot be handled in traditional RNN-based approaches. Moreover, CGMH works in the inference stage, and does not require parallel corpora for training. We evaluate our method on a variety of tasks, including keywords-to-sentence generation, unsupervised sentence paraphrasing, and unsupervised sentence error correction. CGMH achieves high performance compared with previous supervised methods for sentence generation. Our code is released at https://github.com/NingMiao/CGMH
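The sampling idea the abstract describes can be illustrated with a toy sketch. This is not CGMH itself: the bigram scorer below is an illustrative stand-in for an RNN language-model likelihood, the replace/insert/delete proposals only loosely mirror CGMH's word-level operators, and the acceptance rule assumes symmetric proposals (the actual method computes the full Metropolis-Hastings proposal ratio). It does show the key point of the approach: hard constraints, such as required keywords, are enforced by construction in the proposal, with no constrained decoding needed.

```python
import math
import random

random.seed(0)

# Toy "language model": rewards adjacent-word bigrams seen in a tiny
# corpus. A stand-in for the RNN likelihood used in the paper.
CORPUS_BIGRAMS = {
    ("the", "cat"), ("cat", "sat"), ("sat", "on"),
    ("on", "the"), ("the", "mat"),
}
VOCAB = ["the", "cat", "sat", "on", "mat", "dog"]

def score(sentence):
    """Unnormalized log-probability: reward known bigrams, penalize length."""
    hits = sum((a, b) in CORPUS_BIGRAMS for a, b in zip(sentence, sentence[1:]))
    return 2.0 * hits - 0.1 * len(sentence)

def propose(sentence, keywords):
    """Randomly replace, insert, or delete one word, never touching keywords."""
    s = list(sentence)
    op = random.choice(["replace", "insert", "delete"])
    if op == "replace" and s:
        i = random.randrange(len(s))
        if s[i] not in keywords:
            s[i] = random.choice(VOCAB)
    elif op == "insert":
        s.insert(random.randrange(len(s) + 1), random.choice(VOCAB))
    elif op == "delete" and len(s) > 1:
        i = random.randrange(len(s))
        if s[i] not in keywords:
            del s[i]
    return s

def mh_generate(keywords, steps=2000):
    """Metropolis-Hastings walk over sentences containing all keywords."""
    current = list(keywords)  # start from the constraint itself
    for _ in range(steps):
        candidate = propose(current, set(keywords))
        # Simplified MH acceptance (symmetric proposals assumed):
        # accept with probability min(1, pi(candidate) / pi(current)).
        if math.log(random.random()) < score(candidate) - score(current):
            current = candidate
    return current

sent = mh_generate(["cat", "mat"])
assert "cat" in sent and "mat" in sent  # keyword constraint holds by construction
```

Because every proposal leaves the keywords in place, each sample in the chain satisfies the constraint exactly; the sampler only has to trade off fluency (the score) against the edits, which is what lets this style of method handle constraints that sequential left-to-right RNN decoding cannot.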

Related research

- Backward and Forward Language Modeling for Constrained Sentence Generation (12/21/2015)
- Incorporating Discriminator in Sentence Generation: a Gibbs Sampling Method (02/25/2018)
- Unsupervised Paraphrasing by Simulated Annealing (09/09/2019)
- Show Me How To Revise: Improving Lexically Constrained Sentence Generation with XLNet (09/13/2021)
- Language coverage and generalization in RNN-based continuous sentence embeddings for interacting agents (11/05/2019)
- Idiomatic Expression Paraphrasing without Strong Supervision (12/16/2021)
- Effective Batching for Recurrent Neural Network Grammars (05/31/2021)
