Improved Semantic Representations From Tree-Structured Long Short-Term Memory Networks

02/28/2015
by   Kai Sheng Tai, et al.

Because of their superior ability to preserve sequence information over time, Long Short-Term Memory (LSTM) networks, a type of recurrent neural network with a more complex computational unit, have obtained strong results on a variety of sequence modeling tasks. The only underlying LSTM structure explored so far, however, is a linear chain, whereas natural language exhibits syntactic properties that naturally combine words into phrases. We introduce the Tree-LSTM, a generalization of LSTMs to tree-structured network topologies. Tree-LSTMs outperform all existing systems and strong LSTM baselines on two tasks: predicting the semantic relatedness of two sentences (SemEval 2014, Task 1) and sentiment classification (Stanford Sentiment Treebank).
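The paper's Child-Sum variant computes each node's state from the summed hidden states of its children, with a separate forget gate per child. A minimal NumPy sketch of that cell, following the paper's equations (parameter names and dimensions here are illustrative, not the authors' released implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class ChildSumTreeLSTM:
    """Sketch of the Child-Sum Tree-LSTM cell (Tai et al., 2015)."""

    def __init__(self, x_dim, h_dim):
        def mat(r, c):
            return rng.normal(scale=0.1, size=(r, c))
        # One (W, U, b) triple per gate: input, forget, output, update.
        self.W = {g: mat(h_dim, x_dim) for g in "ifou"}
        self.U = {g: mat(h_dim, h_dim) for g in "ifou"}
        self.b = {g: np.zeros(h_dim) for g in "ifou"}
        self.h_dim = h_dim

    def node(self, x, children):
        """children: list of (h_k, c_k) pairs; returns (h_j, c_j)."""
        # h~_j: sum of child hidden states (zero vector at a leaf).
        h_tilde = sum((h for h, _ in children), np.zeros(self.h_dim))
        gate = lambda g, h: self.W[g] @ x + self.U[g] @ h + self.b[g]
        i = sigmoid(gate("i", h_tilde))
        o = sigmoid(gate("o", h_tilde))
        u = np.tanh(gate("u", h_tilde))
        c = i * u
        # A separate forget gate per child lets the cell selectively
        # discard information from individual subtrees.
        for h_k, c_k in children:
            f_k = sigmoid(gate("f", h_k))
            c = c + f_k * c_k
        return o * np.tanh(c), c

cell = ChildSumTreeLSTM(x_dim=4, h_dim=8)
leaf1 = cell.node(rng.normal(size=4), [])
leaf2 = cell.node(rng.normal(size=4), [])
root_h, root_c = cell.node(rng.normal(size=4), [leaf1, leaf2])
print(root_h.shape)  # (8,)
```

With a linear chain (each node having exactly one child) this reduces to a standard LSTM, which is the sense in which the Tree-LSTM generalizes it.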


Related research

10/31/2015 · Top-down Tree Long Short-Term Memory Networks
Long Short-Term Memory (LSTM) networks, a type of recurrent neural netwo...

02/08/2017 · Automatic Rule Extraction from Long Short Term Memory Networks
Although deep learning models have proven effective at solving problems ...

11/15/2017 · A Sequential Neural Encoder with Latent Structured Description for Modeling Sentences
In this paper, we propose a sequential neural encoder with latent struct...

10/25/2019 · A memory enhanced LSTM for modeling complex temporal dependencies
In this paper, we present Gamma-LSTM, an enhanced long short term memory...

06/22/2018 · Persistent Hidden States and Nonlinear Transformation for Long Short-Term Memory
Recurrent neural networks (RNNs) have been drawing much attention with g...

10/22/2018 · Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks
Recurrent neural network (RNN) models are widely used for processing seq...

11/04/2015 · Semi-supervised Sequence Learning
We present two approaches that use unlabeled data to improve sequence le...
