Empirical Analysis of Limits for Memory Distance in Recurrent Neural Networks

12/20/2022
by Steffen Illium, et al.

Common to all kinds of recurrent neural networks (RNNs) is the intention to model relations between data points through time. When there is no immediate relationship between subsequent data points (e.g., when the data points are generated at random), we show that RNNs are still able to remember a few data points back into the sequence by memorizing them outright using standard backpropagation. However, we also show that for classical RNNs, LSTM, and GRU networks, the distance between data points across recurrent calls that can be reproduced this way is highly limited (even compared to the case of a loose connection between data points) and subject to various constraints imposed by the type and size of the RNN in question. This implies the existence of a hard limit (well below the information-theoretic one) on the distance between related data points within which RNNs are still able to recognize that relation.
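The memory-distance setup the abstract describes can be pictured as a delayed-recall task: the network sees a sequence of random vectors and must output, at each step, the input from a fixed number of steps earlier. The following is a minimal sketch of such a dataset generator; the function name and parameters are illustrative and not taken from the paper.

```python
import numpy as np

def make_recall_task(seq_len=20, distance=5, dim=3, seed=0):
    """Delayed-recall task: at step t, the target is the input from
    `distance` steps earlier (zeros for the first `distance` steps).
    Inputs are random, so there is no relation to exploit other than
    memorizing them verbatim across `distance` recurrent calls."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal((seq_len, dim))
    y = np.zeros_like(x)
    y[distance:] = x[:-distance]
    return x, y

x, y = make_recall_task()
```

Increasing `distance` while holding the network type and size fixed is then a direct probe of the memory limit the paper measures: beyond some distance, training loss no longer reaches zero.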

