Context based Text-generation using LSTM networks

04/30/2020
by Sivasurya Santhanam, et al.

Long short-term memory (LSTM) units in sequence-based models are used in translation, question-answering systems, and classification tasks because of their capability to learn long-term dependencies. In natural language generation, LSTM networks provide impressive results on text-generation tasks by learning language models with grammatically stable syntax. The downside, however, is that the network does not learn about context: it learns only the input-output function and generates text for a given set of input words irrespective of pragmatics. Because the model is trained without any such context, there is no semantic consistency among the generated sentences. The proposed model is instead trained to generate text for a given set of input words along with a context vector. A context vector is similar to a paragraph vector that grasps the semantic meaning (context) of a sentence. Several methods of extracting context vectors are proposed in this work. While training the language model, context vectors are trained along with the input-output sequences. Owing to this structure, the model learns the relation among the input words, the context vector, and the target word. Given a set of context terms, a well-trained model generates text around the provided context. Based on how the context vectors are computed, two variations of the model are explored (word importance and word clustering). In the word-clustering method, suitable embeddings across various domains are also explored. The results are evaluated based on the semantic closeness of the generated text to the given context.
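The conditioning scheme described above — feeding a context vector alongside the input words at every timestep so the network learns the relation among inputs, context, and target word — can be sketched as follows. This is a minimal PyTorch illustration; the layer sizes and the choice to concatenate the context vector with each word embedding are assumptions for the sketch, not the paper's exact architecture:

```python
import torch
import torch.nn as nn

class ContextLSTM(nn.Module):
    """LSTM language model conditioned on a sentence-level context vector.

    At each timestep the word embedding is concatenated with a fixed
    context vector, so next-word predictions depend on both the input
    sequence and the provided context. All dimensions are illustrative.
    """

    def __init__(self, vocab_size=1000, embed_dim=64, context_dim=32, hidden_dim=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # LSTM input is the word embedding plus the context vector.
        self.lstm = nn.LSTM(embed_dim + context_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, tokens, context):
        # tokens:  (batch, seq_len) word indices
        # context: (batch, context_dim) e.g. an averaged/clustered word-vector
        emb = self.embed(tokens)                             # (batch, seq, embed_dim)
        ctx = context.unsqueeze(1).expand(-1, emb.size(1), -1)
        h, _ = self.lstm(torch.cat([emb, ctx], dim=-1))      # condition every step
        return self.out(h)                                   # next-word logits

model = ContextLSTM()
tokens = torch.randint(0, 1000, (2, 5))   # a toy batch of 2 sequences
context = torch.randn(2, 32)              # toy context vectors
logits = model(tokens, context)           # (2, 5, 1000)
```

At generation time, the same context vector would be held fixed while words are sampled from the logits one step at a time, which is what keeps the output semantically close to the supplied context terms.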


