Hierarchical Architectures in Reservoir Computing Systems

05/14/2021
by John Moon, et al.

Reservoir computing (RC) offers efficient temporal data processing with a low training cost by separating a recurrent neural network into a fixed network with recurrent connections and a trainable linear readout. The quality of the fixed network, called the reservoir, is the most important factor determining the performance of the RC system. In this paper, we investigate the influence of a hierarchical reservoir structure on the properties of the reservoir and the performance of the RC system. Analogous to deep neural networks, stacking sub-reservoirs in series is an efficient way to enhance the nonlinearity of the transformation of data into a high-dimensional space and to expand the diversity of temporal information captured by the reservoir. These deep reservoir systems offer better performance than simply increasing the size of the reservoir or the number of parallel sub-reservoirs. Low-frequency components are mainly captured by sub-reservoirs in the later stages of the deep reservoir structure, similar to the observation that more abstract information is extracted by late-stage layers of deep neural networks. When the total size of the reservoir is fixed, the tradeoff between the number of sub-reservoirs and the size of each sub-reservoir needs to be considered carefully, because the ability of an individual sub-reservoir degrades at small sizes. The improved performance of the deep reservoir structure also eases the implementation of RC systems in hardware.
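The serial stacking described above can be sketched in a few lines of numpy. This is a minimal illustration, not the authors' implementation: the layer sizes, scaling constants, and the ridge-regression readout are assumptions chosen for a toy next-step prediction task. Each sub-reservoir is a fixed random recurrent network (as in an echo state network), the states of one layer drive the next layer, and only a linear readout over the concatenated states is trained.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_in, n_res, spectral_radius=0.9, input_scale=0.5):
    """Create the fixed input and recurrent weights of one sub-reservoir."""
    W_in = rng.uniform(-input_scale, input_scale, (n_res, n_in))
    W = rng.uniform(-1.0, 1.0, (n_res, n_res))
    # Rescale so the spectral radius is below 1 (echo state property).
    W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
    return W_in, W

def run_deep_reservoir(u, reservoirs):
    """Drive sub-reservoirs in series; each layer's states feed the next."""
    all_states = []
    x_in = u  # shape (T, n_in)
    for W_in, W in reservoirs:
        T = x_in.shape[0]
        x = np.zeros(W.shape[0])
        layer_states = np.empty((T, W.shape[0]))
        for t in range(T):
            x = np.tanh(W_in @ x_in[t] + W @ x)
            layer_states[t] = x
        all_states.append(layer_states)
        x_in = layer_states  # serial stacking: states become the next layer's input
    # Concatenate every layer's states so the readout sees the whole hierarchy.
    return np.hstack(all_states)

# Two stacked 50-unit sub-reservoirs driven by a scalar sine signal.
u = np.sin(np.linspace(0, 20, 500))[:, None]
layers = [make_reservoir(1, 50), make_reservoir(50, 50)]
X = run_deep_reservoir(u, layers)          # shape (500, 100)

# Train only the linear readout (ridge regression) to predict the next sample.
target = u[1:]
X_train = X[:-1]
W_out = np.linalg.solve(X_train.T @ X_train + 1e-6 * np.eye(X_train.shape[1]),
                        X_train.T @ target)
mse = np.mean((X_train @ W_out - target) ** 2)
```

Note that only `W_out` is trained; the recurrent weights of every sub-reservoir stay fixed after initialization, which is what keeps the training cost low.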

