
Compositional Distributional Semantics with Long Short Term Memory

by Phong Le, et al.
University of Amsterdam

We propose an extension of the recursive neural network that makes use of a variant of the long short-term memory architecture. The extension allows information from low in the parse tree to be stored in a memory register (the 'memory cell') and used much later, higher up in the parse tree. This provides a solution to the vanishing gradient problem and allows the network to capture long-range dependencies. Experimental results show that our composition outperforms the traditional neural-network composition on the Stanford Sentiment Treebank.
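The core idea, that a gated memory cell lets information from low in the parse tree survive to nodes much higher up, can be sketched as a single binary composition step. The sketch below is an illustration of the general tree-LSTM idea, not the paper's exact equations; the gate names, dimensions, and random initialization are assumptions for demonstration.

```python
import math
import random

random.seed(0)
d = 4  # hidden/cell size (illustrative choice, not from the paper)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def mat(rows, cols):
    # Small random weight matrix; real models would learn these.
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]

def matvec(W, v):
    return [sum(w * x for w, x in zip(row, v)) for row in W]

# One weight matrix per gate: input, left/right forget, output, update.
W = {g: mat(d, 2 * d) for g in ("i", "fl", "fr", "o", "u")}

def compose(left, right):
    """Combine two children (h, c) into a parent (h, c).

    The parent's memory cell c is a gated sum that includes the children's
    cells, so information stored low in the tree can pass upward unchanged
    when the forget gates stay open -- the mechanism that mitigates
    vanishing gradients over deep parse trees.
    """
    (hl, cl), (hr, cr) = left, right
    x = hl + hr  # concatenate children's hidden states
    i  = [sigmoid(v) for v in matvec(W["i"], x)]   # input gate
    fl = [sigmoid(v) for v in matvec(W["fl"], x)]  # forget gate, left child
    fr = [sigmoid(v) for v in matvec(W["fr"], x)]  # forget gate, right child
    o  = [sigmoid(v) for v in matvec(W["o"], x)]   # output gate
    u  = [math.tanh(v) for v in matvec(W["u"], x)] # candidate update
    c = [iv * uv + flv * clv + frv * crv
         for iv, uv, flv, clv, frv, crv in zip(i, u, fl, cl, fr, cr)]
    h = [ov * math.tanh(cv) for ov, cv in zip(o, c)]
    return h, c

# Leaves carry (hidden state, empty memory cell), e.g. from word embeddings.
def leaf():
    return [random.uniform(-1, 1) for _ in range(d)], [0.0] * d

# Compose a tiny tree: ((w1 w2) w3).
h, c = compose(compose(leaf(), leaf()), leaf())
```

Because the parent cell adds `fl * cl + fr * cr` rather than squashing the children's states through a nonlinearity, gradients can flow back down the tree along the cell pathway, which is what the abstract refers to as a solution to the vanishing gradient problem.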



