Backward and Forward Language Modeling for Constrained Sentence Generation

12/21/2015
by Lili Mou, et al.

Recent language models, especially those based on recurrent neural networks (RNNs), make it possible to generate natural language from a learned probability distribution. Language generation has wide applications, including machine translation, summarization, question answering, and conversational systems. Existing methods typically learn a joint probability of words conditioned on additional information, which is (either statically or dynamically) fed into the RNN's hidden layer. In many applications, however, we may wish to impose hard constraints on the generated text, e.g., that a particular word must appear in the sentence. Existing approaches cannot satisfy such constraints, because a conventional LM generates strictly left to right and thus cannot guarantee that a given word appears somewhere mid-sentence. In this paper, we propose a novel backward and forward language model. Given a specific word, we use RNNs to generate the preceding words and the following words, either simultaneously or asynchronously, resulting in two model variants. In this way, the given word can appear at any position in the sentence. Experimental results show that the generated texts are comparable in quality to those of sequential LMs.
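To make the idea concrete, here is a minimal sketch of the asynchronous variant in Python. It assumes two pre-trained RNN language models, one trained on reversed sentences (backward) and one on ordinary sentences (forward), exposed through a hypothetical next_token_distribution(prefix) interface; all names below are illustrative assumptions, not the authors' implementation. The backward model first grows the sentence leftward from the given word, the partial result is reversed into natural order, and the forward model then completes it. (The synchronous variant would instead alternate one backward step and one forward step.)

    import random

    def sample_next(model, prefix):
        """Sample one token from the model's next-token distribution.

        next_token_distribution is an assumed API returning a
        dict mapping token -> probability.
        """
        dist = model.next_token_distribution(prefix)
        tokens, probs = zip(*dist.items())
        return random.choices(tokens, weights=probs, k=1)[0]

    def generate_async(backward_lm, forward_lm, word, eos="</s>", max_len=20):
        # Backward pass: grow the sentence to the LEFT of the given word.
        # The backward LM is trained on reversed sentences, so its "next"
        # token is the preceding word in the original word order.
        left = [word]
        while len(left) < max_len:
            tok = sample_next(backward_lm, left)
            if tok == eos:
                break
            left.append(tok)
        prefix = list(reversed(left))  # restore natural left-to-right order

        # Forward pass: continue generating to the RIGHT, conditioned on
        # the completed prefix, until end-of-sentence.
        sentence = prefix
        while len(sentence) < 2 * max_len:
            tok = sample_next(forward_lm, sentence)
            if tok == eos:
                break
            sentence.append(tok)
        return sentence

Because the backward pass fixes how many words precede the constraint, the given word can land at any position in the final sentence, which a left-to-right LM cannot guarantee.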


Related research

06/21/2018
BFGAN: Backward and Forward Generative Adversarial Networks for Lexically Constrained Sentence Generation
In many natural language generation tasks, incorporating additional know...

11/14/2018
CGMH: Constrained Sentence Generation by Metropolis-Hastings Sampling
In real-world applications of natural language generation, there are oft...

03/14/2021
Learning a Word-Level Language Model with Sentence-Level Noise Contrastive Estimation for Contextual Sentence Probability Estimation
Inferring the probability distribution of sentences or word sequences is...

05/30/2019
TS-RNN: Text Steganalysis Based on Recurrent Neural Networks
With the rapid development of natural language processing technologies, ...

08/23/2018
The Importance of Generation Order in Language Modeling
Neural language models are a critical component of state-of-the-art syst...

11/13/2018
Modeling Local Dependence in Natural Language with Multi-channel Recurrent Neural Networks
Recurrent Neural Networks (RNNs) have been widely used in processing nat...

03/23/2023
Return of the RNN: Residual Recurrent Networks for Invertible Sentence Embeddings
This study presents a novel model for invertible sentence embeddings usi...
