LSTM with Working Memory

05/06/2016
by Andrew Pulver, et al.

Previous RNN architectures have largely been superseded by LSTM, or "Long Short-Term Memory". Since its introduction, there have been many variations on this simple design. However, it is still widely used, and we are not aware of a gated-RNN architecture that outperforms LSTM in a broad sense while remaining as simple and efficient. In this paper we propose a modified LSTM-like architecture. Our architecture is still simple and achieves better performance on the tasks we tested it on. We also introduce a new RNN performance benchmark that uses handwritten digits and stresses several important network capabilities.
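
For context, the following is a minimal sketch of the standard LSTM cell update that gated-RNN variants such as the one proposed here build on. It is not the paper's working-memory architecture; the function names, weight layout, and sizes are illustrative assumptions.

    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def lstm_step(x, h_prev, c_prev, W, U, b):
        # One step of a standard LSTM cell (illustrative, not the paper's variant).
        # x: input (d,); h_prev, c_prev: previous hidden/cell state (n,)
        # W: (4n, d), U: (4n, n), b: (4n,); gate rows stacked as
        # [input, forget, output, candidate].
        n = h_prev.shape[0]
        z = W @ x + U @ h_prev + b
        i = sigmoid(z[0:n])          # input gate
        f = sigmoid(z[n:2*n])        # forget gate
        o = sigmoid(z[2*n:3*n])      # output gate
        g = np.tanh(z[3*n:4*n])      # candidate cell update
        c = f * c_prev + i * g       # new cell state
        h = o * np.tanh(c)           # new hidden state
        return h, c

    # Tiny usage example with random parameters (hypothetical sizes).
    rng = np.random.default_rng(0)
    d, n = 3, 4
    W = rng.normal(size=(4 * n, d))
    U = rng.normal(size=(4 * n, n))
    b = np.zeros(4 * n)
    h, c = lstm_step(rng.normal(size=d), np.zeros(n), np.zeros(n), W, U, b)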

