
Understanding LSTM – a tutorial into Long Short-Term Memory Recurrent Neural Networks

by Ralf C. Staudemeyer, et al.
Hochschule Schmalkalden
Singapore University of Technology and Design

Long Short-Term Memory Recurrent Neural Networks (LSTM-RNN) are among the most powerful dynamic classifiers publicly known. The network itself and the related learning algorithms are reasonably well documented to give an idea of how they work. This paper sheds more light on how LSTM-RNNs evolved and why they work impressively well, focusing on the early, ground-breaking publications. We significantly improved the documentation and fixed a number of errors and inconsistencies that had accumulated in previous publications. To support understanding, we also revised and unified the notation used.
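As a rough illustration of the LSTM recurrence whose notation the paper unifies, a minimal single-cell forward step can be sketched in NumPy. The gate names (input, forget, candidate, output) follow the standard formulation; the stacked-weight layout and variable names here are illustrative choices, not taken from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One forward step of a standard LSTM cell.

    W, U, b hold the stacked parameters for the input, forget,
    cell-candidate, and output gates (in that order).
    """
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b        # stacked pre-activations, shape (4*H,)
    i = sigmoid(z[0*H:1*H])           # input gate
    f = sigmoid(z[1*H:2*H])           # forget gate
    g = np.tanh(z[2*H:3*H])           # candidate cell state
    o = sigmoid(z[3*H:4*H])           # output gate
    c = f * c_prev + i * g            # new cell state (gated memory update)
    h = o * np.tanh(c)                # new hidden state
    return h, c

# tiny usage example with random weights (hypothetical dimensions)
rng = np.random.default_rng(0)
X_DIM, H_DIM = 3, 2
W = rng.normal(size=(4 * H_DIM, X_DIM))
U = rng.normal(size=(4 * H_DIM, H_DIM))
b = np.zeros(4 * H_DIM)
h, c = lstm_step(rng.normal(size=X_DIM), np.zeros(H_DIM), np.zeros(H_DIM), W, U, b)
```

The key property visible in the cell-state update `c = f * c_prev + i * g` is the additive memory path that lets gradients flow over long time lags, which is the core idea the surveyed publications developed.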



