Simplified Long Short-term Memory Recurrent Neural Networks: part I

07/14/2017
by Atra Akandeh, et al.

We present five variants of the standard Long Short-term Memory (LSTM) recurrent neural network, obtained by uniformly reducing blocks of adaptive parameters in the gating mechanisms. For simplicity, we refer to these models as LSTM1, LSTM2, LSTM3, LSTM4, and LSTM5. Such parameter-reduced variants speed up training computations and are better suited to implementation on constrained embedded platforms. We comparatively evaluate the five variants on the classical MNIST dataset and demonstrate that they are comparable to a standard LSTM implementation while using fewer parameters. Moreover, we observe that in some cases the standard LSTM's accuracy drops after a number of epochs when using the ReLU nonlinearity, whereas LSTM3, LSTM4, and LSTM5 retain their performance.
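To make the idea of "reducing blocks of adaptive parameters in the gating mechanisms" concrete, here is a minimal NumPy sketch of one standard LSTM step next to a hypothetical reduced variant whose gates drop their input-weight blocks and keep only the recurrent term and bias. This is an illustration of block removal, not the paper's exact LSTM1 through LSTM5 definitions (those are specified in the full text); all parameter names here are my own.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step_standard(x, h, c, p):
    """One step of a standard LSTM cell.
    Each gate uses an input-weight block W, a recurrent block U, and a bias b."""
    i = sigmoid(p["Wi"] @ x + p["Ui"] @ h + p["bi"])  # input gate
    f = sigmoid(p["Wf"] @ x + p["Uf"] @ h + p["bf"])  # forget gate
    o = sigmoid(p["Wo"] @ x + p["Uo"] @ h + p["bo"])  # output gate
    g = np.tanh(p["Wg"] @ x + p["Ug"] @ h + p["bg"])  # candidate cell update
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

def lstm_step_reduced(x, h, c, p):
    """Illustrative parameter-reduced step: the three gates drop their
    input-weight blocks W and use only the recurrent term and bias.
    (A hypothetical stand-in for one of the paper's reductions.)"""
    i = sigmoid(p["Ui"] @ h + p["bi"])
    f = sigmoid(p["Uf"] @ h + p["bf"])
    o = sigmoid(p["Uo"] @ h + p["bo"])
    g = np.tanh(p["Wg"] @ x + p["Ug"] @ h + p["bg"])  # candidate keeps full form
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new
```

With input size m and hidden size n, the standard cell carries 4(nm + n^2 + n) parameters; removing the three gate input-weight blocks as above saves 3nm of them, which is the kind of uniform block reduction the abstract describes.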


Related research

Simplified Long Short-term Memory Recurrent Neural Networks: part II (07/14/2017)
This is part II of three-part work. Here, we present a second set of int...

Slim LSTM networks: LSTM_6 and LSTM_C6 (01/18/2019)
We have shown previously that our parameter-reduced variants of Long Sho...

Low-Complexity LSTM Training and Inference with FloatSD8 Weight Representation (01/23/2020)
The FloatSD technology has been shown to have excellent performance on l...

Benchmarking of LSTM Networks (08/11/2015)
LSTM (Long Short-Term Memory) recurrent neural networks have been highly...

Predicting Inflation with Neural Networks (04/08/2021)
This paper applies neural network models to forecast inflation. The use ...

Recurrent Highway Networks (07/12/2016)
Many sequential processing tasks require complex nonlinear transition fu...

LiteLSTM Architecture for Deep Recurrent Neural Networks (01/27/2022)
Long short-term memory (LSTM) is a robust recurrent neural network archi...
