Short-term Memory of Deep RNNs

02/02/2018
by Claudio Gallicchio, et al.

The extension of deep learning towards temporal data processing is gaining increasing research interest. In this paper we investigate the properties of the state dynamics developed in successive layers of deep recurrent neural networks (RNNs) in terms of short-term memory abilities. Our results reveal interesting insights that shed light on the nature of layering as a factor of RNN design. Notably, higher layers in a hierarchically organized RNN architecture turn out to be inherently biased towards longer memory spans, even prior to training of the recurrent connections. Moreover, in the context of the Reservoir Computing framework, our analysis also points out the benefit of a layered recurrent organization as an efficient approach to improving the memory skills of reservoir models.
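The layering effect described above can be probed directly in code. Below is a minimal sketch, not the authors' implementation: a two-layer echo state network whose recurrent weights are randomly initialized and left untrained, with short-term memory capacity (MC, the summed squared correlation between delayed inputs and trained linear readouts) measured separately for each layer. All sizes, scaling factors, and the ridge parameter are illustrative assumptions.

```python
# Sketch (not the paper's code): per-layer memory capacity of a deep,
# untrained reservoir. Hyperparameters below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

def reservoir_layer(n_in, n_units, rho=0.9, in_scale=0.5):
    """Random untrained layer: recurrent weights rescaled to spectral
    radius rho (a standard ESN initialization)."""
    W_in = rng.uniform(-in_scale, in_scale, (n_units, n_in))
    W = rng.uniform(-1, 1, (n_units, n_units))
    W *= rho / max(abs(np.linalg.eigvals(W)))
    return W_in, W

def run_deep_reservoir(u, layers):
    """Drive a stack of reservoir layers; layer l>1 is fed layer l-1's state."""
    T = len(u)
    states, drive = [], u.reshape(T, 1)
    for W_in, W in layers:
        x = np.zeros(W.shape[0])
        X = np.empty((T, W.shape[0]))
        for t in range(T):
            x = np.tanh(W_in @ drive[t] + W @ x)
            X[t] = x
        states.append(X)
        drive = X  # this layer's states drive the next layer
    return states

def memory_capacity(X, u, max_delay=40, washout=100, ridge=1e-6):
    """MC = sum over delays k of r^2 between u(t-k) and a ridge
    readout trained on the states X(t)."""
    mc = 0.0
    for k in range(1, max_delay + 1):
        Xk, yk = X[washout:], np.roll(u, k)[washout:]  # washout drops wrap-around
        w = np.linalg.solve(Xk.T @ Xk + ridge * np.eye(X.shape[1]), Xk.T @ yk)
        mc += np.corrcoef(Xk @ w, yk)[0, 1] ** 2
    return mc

T, n_units = 4000, 100
u = rng.uniform(-0.8, 0.8, T)  # i.i.d. input, standard for MC experiments
layers = [reservoir_layer(1, n_units), reservoir_layer(n_units, n_units)]
for l, X in enumerate(run_deep_reservoir(u, layers), start=1):
    print(f"layer {l}: MC ~ {memory_capacity(X, u):.2f}")
```

If the paper's finding holds, the layer-2 readout should recover inputs at longer delays than the layer-1 readout, even though neither reservoir is trained.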
