Simplified Long Short-term Memory Recurrent Neural Networks: part III

07/14/2017
by Atra Akandeh, et al.

This is part III of a three-part work. In parts I and II, we presented eight variants of simplified Long Short-Term Memory (LSTM) recurrent neural networks (RNNs). Fast computation, especially on constrained computing resources, is an important factor in processing big time-sequence data. In this part III paper, we present and evaluate two new LSTM variants that dramatically reduce the computational load while retaining performance comparable to the base (standard) LSTM RNN. In these new variants, we impose (Hadamard) pointwise state multiplications in the cell-memory network in addition to the gating signal networks.
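The full paper defines the two new variants precisely; as a rough sketch of the general idea only, the NumPy step below replaces the full recurrent weight matrices U of a standard LSTM with learned vectors u, so every recurrent term U h_{t-1} becomes a Hadamard product u * h_{t-1}, in the gate networks and in the cell-memory candidate alike. The function name slim_lstm_step, the parameter layout, and the dimensions are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def slim_lstm_step(x, h_prev, c_prev, params):
    """One step of a simplified LSTM cell (illustrative sketch).

    The recurrent contribution to each gate and to the cell-memory
    candidate uses a Hadamard (pointwise) product u * h_prev with a
    learned vector u, instead of a full matrix product U @ h_prev,
    dropping the recurrent cost from O(n^2) to O(n) per network.
    """
    W_i, u_i, b_i = params["i"]   # input gate
    W_f, u_f, b_f = params["f"]   # forget gate
    W_o, u_o, b_o = params["o"]   # output gate
    W_c, u_c, b_c = params["c"]   # cell-memory candidate

    i = sigmoid(W_i @ x + u_i * h_prev + b_i)
    f = sigmoid(W_f @ x + u_f * h_prev + b_f)
    o = sigmoid(W_o @ x + u_o * h_prev + b_o)
    c_tilde = np.tanh(W_c @ x + u_c * h_prev + b_c)

    c = f * c_prev + i * c_tilde  # pointwise cell-state update
    h = o * np.tanh(c)
    return h, c

# Example with hypothetical sizes: input dim 4, hidden dim 8.
rng = np.random.default_rng(0)
n_in, n_h = 4, 8
params = {g: (rng.standard_normal((n_h, n_in)) * 0.1,  # W: input weights
              rng.standard_normal(n_h) * 0.1,          # u: recurrent vector
              np.zeros(n_h))                           # b: bias
          for g in "ifoc"}
h, c = np.zeros(n_h), np.zeros(n_h)
for t in range(5):
    h, c = slim_lstm_step(rng.standard_normal(n_in), h, c, params)
print(h.shape, c.shape)  # (8,) (8,)
```

With hidden size n and input size m, this sketch keeps the four n-by-m input weight matrices but shrinks the four n-by-n recurrent matrices of a standard LSTM to four length-n vectors, which is where the computational saving in such simplified variants comes from.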

Related research

Simplified Long Short-term Memory Recurrent Neural Networks: part II (07/14/2017)
This is part II of a three-part work. Here, we present a second set of int...

SLIM LSTMs (12/29/2018)
Long Short-Term Memory (LSTM) Recurrent Neural Networks (RNNs) rely on g...

Fast Weight Long Short-Term Memory (04/18/2018)
Associative memory using fast weights is a short-term memory mechanism t...

On Generalization Bounds of a Family of Recurrent Neural Networks (10/28/2019)
Recurrent Neural Networks (RNNs) have been widely applied to sequential ...

A Taxonomy for Neural Memory Networks (05/01/2018)
In this paper, a taxonomy for memory networks is proposed based on their...

On Evaluating the Generalization of LSTM Models in Formal Languages (11/02/2018)
Recurrent Neural Networks (RNNs) are theoretically Turing-complete and e...
