How LSTM Encodes Syntax: Exploring Context Vectors and Semi-Quantization on Natural Text

10/01/2020
by Chihiro Shibata, et al.

The Long Short-Term Memory recurrent neural network (LSTM) is widely used and known to capture informative long-term syntactic dependencies. However, how such information is reflected in its internal vectors for natural text has not yet been sufficiently investigated. We analyze these vectors by training a language model on text in which syntactic structures are implicitly given. We empirically show that the context-update vectors, i.e., the outputs of the internal gates, are approximately quantized to binary or ternary values, which helps the language model count the depth of nesting accurately, as Suzgun et al. (2019) recently showed for synthetic Dyck languages. We show that the activations of some dimensions of the context vector are highly correlated with the depth of phrase structures such as VP and NP. Moreover, with L_1 regularization, we find that whether a word is inside a given phrase structure can be accurately predicted from a small number of components of the context vector. Even when the model is trained on raw text, the context vectors still correlate well with the phrase structures. Finally, we show that natural clusters of the function words and the parts of speech that trigger phrases are represented in a small but principal subspace of the LSTM's context-update vector.
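The probing pipeline the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes a PyTorch LSTMCell so that the per-step cell state c_t (the "context vector") can be recorded, it skips the language-model training the paper performs (the weights here are untrained), and it uses random tokens with dummy "inside NP?" labels. The L1-regularized logistic regression stands in for the paper's sparse readout of phrase membership; all names and sizes are illustrative placeholders.

```python
# Minimal sketch of the probing setup: collect per-timestep LSTM cell
# states and fit an L1-regularized classifier predicting whether each
# word lies inside a phrase (e.g. NP). Data and labels are dummies.
import torch
import torch.nn as nn
from sklearn.linear_model import LogisticRegression

VOCAB, EMB, HID, T = 1000, 32, 64, 20  # illustrative sizes

class ProbedLSTM(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB)
        self.cell = nn.LSTMCell(EMB, HID)  # step-wise cell exposes c_t

    def forward(self, tokens):
        h = torch.zeros(tokens.size(0), HID)
        c = torch.zeros(tokens.size(0), HID)
        states = []
        for t in range(tokens.size(1)):
            h, c = self.cell(self.emb(tokens[:, t]), (h, c))
            states.append(c)               # record context vector per step
        return torch.stack(states, dim=1)  # (batch, T, HID)

model = ProbedLSTM()
tokens = torch.randint(0, VOCAB, (8, T))    # dummy token ids
inside_np = torch.randint(0, 2, (8, T))     # dummy "inside NP?" labels

with torch.no_grad():                       # probe only; no LM training here
    c_states = model(tokens)                # (8, T, HID)

X = c_states.reshape(-1, HID).numpy()
y = inside_np.reshape(-1).numpy()

# The L1 penalty drives most coefficients to zero, so the surviving
# components are the few cell-state dimensions tracking the phrase.
probe = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
probe.fit(X, y)
print("nonzero components:", (probe.coef_ != 0).sum())
```

In a real run, the LSTM would first be trained as a language model on the corpus; the probe is then fit on the frozen cell states, and the number of nonzero coefficients indicates how few context-vector components suffice to recover phrase membership.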


