An enhanced Tree-LSTM architecture for sentence semantic modeling using typed dependencies

02/18/2020
by Jeena Kleenankandy, et al.

Tree-based Long Short-Term Memory (LSTM) networks have become the state of the art for modeling the meaning of natural-language text, as they can exploit grammatical syntax and thereby capture non-linear dependencies among the words of a sentence. However, most of these models cannot recognize the difference in meaning caused by a change in the semantic roles of words or phrases, because they ignore the types of the grammatical relations, also known as typed dependencies, in the sentence structure. This paper proposes an enhanced LSTM architecture, called relation-gated LSTM, which can model the relationship between two inputs of a sequence using a control input. We also introduce a Tree-LSTM model, called Typed Dependency Tree-LSTM, that uses both the dependency parse structure of a sentence and the dependency types to embed sentence meaning into a dense vector. The proposed model outperformed its type-unaware counterpart on two typical NLP tasks, Semantic Relatedness Scoring and Sentiment Analysis, in fewer training epochs. The results were comparable to or competitive with other state-of-the-art models. Qualitative analysis showed that changes in the voice of a sentence had little effect on the model's predicted scores, while changes in nominal (noun) words had a more significant impact. The model recognized subtle semantic relationships in sentence pairs. The magnitudes of the learned typed-dependency embeddings also agreed with human intuition. These findings underline the significance of grammatical relations in sentence modeling. The proposed models can serve as a base for future research in this direction.
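The relation-gating idea described above can be sketched as a child-sum Tree-LSTM whose children's hidden states are modulated by a learned embedding of their dependency type before composition. This is a minimal toy illustration under assumed shapes and gating form (the `rel_embed` table, parameter names, and the sigmoid gate are illustrative assumptions, not the paper's exact equations):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # toy hidden size

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy parameters for a child-sum Tree-LSTM cell: input (i), output (o),
# forget (f), and candidate (u) transforms. Shapes are hypothetical.
W = {g: rng.normal(scale=0.1, size=(d, d)) for g in "iofu"}
U = {g: rng.normal(scale=0.1, size=(d, d)) for g in "iofu"}
b = {g: np.zeros(d) for g in "iofu"}

# Hypothetical typed-dependency embeddings: one gating vector per relation.
rel_embed = {
    "nsubj": rng.normal(scale=0.1, size=d),
    "dobj": rng.normal(scale=0.1, size=d),
}

def rel_gated_tree_lstm(x, children):
    """One cell update. children: list of (h_child, c_child, dep_type)."""
    # Relation gate: scale each child's hidden state by its dependency type,
    # so e.g. an nsubj child contributes differently than a dobj child.
    gated = [sigmoid(rel_embed[dep]) * h for h, _, dep in children]
    h_sum = np.sum(gated, axis=0) if children else np.zeros(d)
    i = sigmoid(W["i"] @ x + U["i"] @ h_sum + b["i"])
    o = sigmoid(W["o"] @ x + U["o"] @ h_sum + b["o"])
    u = np.tanh(W["u"] @ x + U["u"] @ h_sum + b["u"])
    c = i * u
    # One forget gate per child, as in the child-sum Tree-LSTM.
    for h_k, c_k, dep in children:
        f = sigmoid(W["f"] @ x + U["f"] @ (sigmoid(rel_embed[dep]) * h_k) + b["f"])
        c = c + f * c_k
    h = o * np.tanh(c)
    return h, c

# Toy tree: leaves "dog" (nsubj) and "ball" (dobj) under head "chased".
h1, c1 = rel_gated_tree_lstm(rng.normal(size=d), [])
h2, c2 = rel_gated_tree_lstm(rng.normal(size=d), [])
h, c = rel_gated_tree_lstm(
    rng.normal(size=d), [(h1, c1, "nsubj"), (h2, c2, "dobj")]
)
print(h.shape)  # root sentence representation, shape (8,)
```

Because the gate depends on the relation label, swapping the `nsubj` and `dobj` roles of the two children generally changes the root vector, which is the behavior a type-unaware child-sum Tree-LSTM cannot exhibit.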


Related research

- Recognizing semantic relation in sentence pairs using Tree-RNNs and Typed dependencies (01/13/2022)
- phi-LSTM: A Phrase-based Hierarchical LSTM Model for Image Captioning (08/20/2016)
- Context based Text-generation using LSTM networks (04/30/2020)
- Sentence Encoding with Tree-constrained Relation Networks (11/26/2018)
- Structure Tree-LSTM: Structure-aware Attentional Document Encoders (02/26/2019)
- Effective LSTMs for Target-Dependent Sentiment Classification (12/03/2015)
- Learning to Compose over Tree Structures via POS Tags (08/18/2018)
