Compositional Distributional Semantics with Long Short Term Memory

03/09/2015
by Phong Le, et al.

We propose an extension of the recursive neural network that makes use of a variant of the long short-term memory (LSTM) architecture. The extension allows information from low in the parse tree to be stored in a memory register (the 'memory cell') and used much later, higher up in the parse tree. This provides a solution to the vanishing gradient problem and allows the network to capture long-range dependencies. Experimental results show that our composition outperformed the traditional neural-network composition on the Stanford Sentiment Treebank.
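The page carries only the abstract, but the composition it describes is concrete enough to sketch. Below is a minimal, hypothetical Python/NumPy illustration of an LSTM-style composition function over a binary parse tree: each parent node gates its children's hidden states and carries their memory cells upward additively, which is the mechanism the abstract credits with solving the vanishing gradient problem. All names here (BinaryTreeLSTM, compose) and the exact gating scheme (a separate weight matrix per gate per child, one forget gate per child) are assumptions for illustration, not necessarily the paper's formulation.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class BinaryTreeLSTM:
    """LSTM-style composition for binary parse trees (illustrative sketch)."""

    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        # One weight matrix per (gate, child) pair; gates: input (i),
        # left/right forget (fl, fr), output (o), candidate update (u).
        self.W = {g: [rng.normal(0.0, 0.1, (dim, dim)) for _ in range(2)]
                  for g in ("i", "fl", "fr", "o", "u")}
        self.b = {g: np.zeros(dim) for g in self.W}
        self.dim = dim

    def _gate(self, g, hl, hr):
        # Pre-activation of gate g from the two children's hidden states.
        return self.W[g][0] @ hl + self.W[g][1] @ hr + self.b[g]

    def compose(self, left, right):
        # Each node is a (hidden state, memory cell) pair.
        (hl, cl), (hr, cr) = left, right
        i = sigmoid(self._gate("i", hl, hr))    # how much new content to write
        fl = sigmoid(self._gate("fl", hl, hr))  # how much of the left cell to keep
        fr = sigmoid(self._gate("fr", hl, hr))  # how much of the right cell to keep
        o = sigmoid(self._gate("o", hl, hr))    # how much of the cell to expose
        u = np.tanh(self._gate("u", hl, hr))    # candidate content
        # The additive cell update is the key: the children's memory cells
        # pass through largely unsquashed, so information from deep in the
        # tree can survive to the root and gradients need not vanish.
        c = i * u + fl * cl + fr * cr
        h = o * np.tanh(c)
        return h, c

# Usage: compose the tree ((w1 w2) w3) bottom-up from leaf embeddings.
dim = 8
cell = BinaryTreeLSTM(dim)
def leaf(seed):
    # Leaves start with a word embedding and an empty memory cell.
    return np.random.default_rng(seed).normal(size=dim), np.zeros(dim)
h12, c12 = cell.compose(leaf(1), leaf(2))
root_h, root_c = cell.compose((h12, c12), leaf(3))

In a sentiment setup like the paper's, the root (and internal) hidden states would presumably feed a softmax classifier over sentiment labels, with parameters trained by backpropagation through structure; none of that machinery is shown in this sketch.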


Related research

Long Short-Term Memory Over Tree Structures (03/16/2015)
The chain-structured long short-term memory (LSTM) has been shown to be effe...

Quantifying the vanishing gradient and long distance dependency problem in recursive neural networks and recursive LSTMs (03/01/2016)
Recursive neural networks (RNN) and their recently proposed extension re...

Long Short-Term Memory Neural Network for Temperature Prediction in Laser Powder Bed Additive Manufacturing (01/30/2023)
In the context of laser powder bed fusion (L-PBF), it is known that the prop...

Long Short-Term Network Based Unobtrusive Perceived Workload Monitoring with Consumer Grade Smartwatches in the Wild (11/30/2019)
Continuous high perceived workload has a negative impact on the individu...

'Moving On' – Investigating Inventors' Ethnic Origins Using Supervised Learning (01/03/2022)
Patent data provides rich information about technical inventions, but do...

Neural Attention Memory (02/18/2023)
We propose a novel perspective of the attention mechanism by reinventing...

An Approximate Bayesian Long Short-Term Memory Algorithm for Outlier Detection (12/23/2017)
Long Short-Term Memory networks trained with gradient descent and back-p...
