
Compositional Distributional Semantics with Long Short Term Memory

03/09/2015
by Phong Le, et al.
University of Amsterdam

We propose an extension of the recursive neural network that makes use of a variant of the long short-term memory architecture. The extension allows information low in a parse tree to be stored in a memory register (the 'memory cell') and used much later, higher up in the parse tree. This provides a solution to the vanishing gradient problem and allows the network to capture long-range dependencies. Experimental results show that our composition outperforms the traditional neural-network composition on the Stanford Sentiment Treebank.
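
As a rough illustration of the kind of composition function the abstract describes, the NumPy sketch below implements a generic binary tree-structured LSTM cell with one forget gate per child, in which a node's memory cell is built from the memory cells of its children. The gating scheme, parameter names, and initialization here are illustrative assumptions, not the paper's exact parameterization.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class BinaryTreeLSTM:
    """Generic binary tree-LSTM composition cell (an illustrative sketch,
    not the paper's exact formulation). Each parse-tree node combines the
    (hidden state, memory cell) pairs of its two children into a new pair;
    the memory cell lets information from low in the tree survive largely
    unchanged to nodes much higher up."""

    GATES = ("i", "fl", "fr", "o", "u")  # input, left/right forget, output, update

    def __init__(self, dim, seed=0):
        rng = np.random.default_rng(seed)
        self.dim = dim
        # One (W_left, W_right, bias) triple per gate; weights are (dim, dim).
        self.params = {
            g: (rng.normal(0.0, 0.1, (dim, dim)),
                rng.normal(0.0, 0.1, (dim, dim)),
                np.zeros(dim))
            for g in self.GATES
        }

    def _gate(self, name, h_l, h_r, act):
        W_l, W_r, b = self.params[name]
        return act(W_l @ h_l + W_r @ h_r + b)

    def compose(self, left, right):
        """left and right are (h, c) pairs; returns the parent's (h, c)."""
        (h_l, c_l), (h_r, c_r) = left, right
        i  = self._gate("i",  h_l, h_r, sigmoid)  # input gate
        fl = self._gate("fl", h_l, h_r, sigmoid)  # forget gate, left child
        fr = self._gate("fr", h_l, h_r, sigmoid)  # forget gate, right child
        o  = self._gate("o",  h_l, h_r, sigmoid)  # output gate
        u  = self._gate("u",  h_l, h_r, np.tanh)  # candidate update
        c = i * u + fl * c_l + fr * c_r           # new memory cell
        h = o * np.tanh(c)                        # new hidden state
        return h, c

# Bottom-up composition over the parse tree ((very good) movie); the
# word vectors here are random stand-ins for learned embeddings.
if __name__ == "__main__":
    dim, rng = 8, np.random.default_rng(1)
    cell = BinaryTreeLSTM(dim)
    leaf = lambda: (rng.normal(size=dim), np.zeros(dim))
    very, good, movie = leaf(), leaf(), leaf()
    root = cell.compose(cell.compose(very, good), movie)
```

The per-child forget gates are the key design point: the network can decide, at every node, how much of each child's memory cell to carry upward, which is what lets signals from deep in the tree reach the root without vanishing.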

