Deep-ESN: A Multiple Projection-encoding Hierarchical Reservoir Computing Framework

11/13/2017
by Qianli Ma, et al.

As efficient recurrent neural network (RNN) models, reservoir computing (RC) models such as Echo State Networks (ESNs) have attracted widespread attention over the last decade. While they have had great success with time series data [1], [2], many time series have a multiscale structure that a single-hidden-layer RC model may have difficulty capturing. In this paper, we propose a novel hierarchical reservoir computing framework that we call Deep Echo State Networks (Deep-ESNs). The most distinctive feature of a Deep-ESN is its ability to process time series through hierarchical projections. Specifically, when an input time series is projected into the high-dimensional echo-state space of a reservoir, a subsequent encoding layer (e.g., PCA, an autoencoder, or a random projection) can project the echo-state representations into a lower-dimensional space. These low-dimensional representations can then be processed by another ESN. By alternating projection layers and encoding layers in the hierarchy, a Deep-ESN not only attenuates the effects of the collinearity problem in ESNs, but also fully exploits the temporal kernel property of ESNs to explore the multiscale dynamics of time series. To fuse the multiscale representations obtained by each reservoir, we add connections from every encoding layer to the final output layer. Theoretical analyses prove that the stability of a Deep-ESN is guaranteed by the echo state property (ESP), and that its time complexity is equivalent to that of a conventional ESN. Experimental results on artificial and real-world time series demonstrate that Deep-ESNs capture multiscale dynamics and outperform both standard ESNs and previous hierarchical ESN-based models.
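To make the pipeline concrete, below is a minimal NumPy sketch of a two-layer Deep-ESN forward pass: each layer drives a fixed random reservoir with the update x(t) = tanh(W_in u(t) + W x(t-1)), compresses the collected echo states with a PCA encoder, and feeds the encoder output to the next reservoir, while every encoder also feeds a ridge-regression readout. The function names, hyperparameters (n_res, n_enc, the spectral radius, the ridge penalty lam), and the toy sine task are illustrative assumptions, not the paper's exact configuration; rescaling W to a spectral radius below 1 is the standard heuristic for satisfying the echo state property.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_reservoir(n_in, n_res, spectral_radius=0.9, input_scale=0.5):
    """Fixed random input/recurrent weights; W is rescaled so its spectral
    radius is below 1, the usual heuristic for the echo state property."""
    W_in = input_scale * (2 * rng.random((n_res, n_in)) - 1)
    W = 2 * rng.random((n_res, n_res)) - 1
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    return W_in, W

def run_reservoir(u_seq, W_in, W):
    """Drive the reservoir with a (T, n_in) sequence; return (T, n_res) echo states."""
    x = np.zeros(W.shape[0])
    states = []
    for u in u_seq:
        x = np.tanh(W_in @ u + W @ x)   # x(t) = tanh(W_in u(t) + W x(t-1))
        states.append(x.copy())
    return np.asarray(states)

def pca_encode(states, n_enc):
    """Encoding layer: project echo states onto their top n_enc principal components."""
    centered = states - states.mean(axis=0)
    _, _, Vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ Vt[:n_enc].T

# Toy one-step-ahead prediction task (illustrative, not from the paper).
T, n_res, n_enc = 500, 100, 20
u = np.sin(0.1 * np.arange(T))[:, None]             # (T, 1) input series

# Layer 1: reservoir projection -> PCA encoding.
W_in1, W1 = make_reservoir(1, n_res)
enc1 = pca_encode(run_reservoir(u, W_in1, W1), n_enc)

# Layer 2: the encoded representation drives the next reservoir.
W_in2, W2 = make_reservoir(n_enc, n_res)
enc2 = pca_encode(run_reservoir(enc1, W_in2, W2), n_enc)

# Fuse multiscale features: every encoding layer connects to the readout.
features = np.hstack([enc1, enc2])                  # (T, 2 * n_enc)
X, y = features[:-1], u[1:]                         # predict u(t+1) from state at t
lam = 1e-6                                          # ridge penalty (assumed)
W_out = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)
print("train MSE:", float(np.mean((X @ W_out - y) ** 2)))
```

Because only W_out is trained (a single linear solve) while the reservoirs stay fixed and each PCA is fit once on the collected states, the cost per layer stays comparable to running a conventional ESN, consistent with the abstract's time-complexity claim.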


Related research

10/13/2020 · Deep Reservoir Networks with Learned Hidden Reservoir Weights using Direct Feedback Alignment
Deep Reservoir Computing has emerged as a new paradigm for deep learning...

06/12/2020 · Reservoir Computing meets Recurrent Kernels and Structured Transforms
Reservoir Computing is a class of simple yet efficient Recurrent Neural ...

07/15/2019 · Dynamical Systems as Temporal Feature Spaces
Parameterized state space models in the form of recurrent networks are o...

07/01/2019 · Analysis of Wide and Deep Echo State Networks for Multiscale Spatiotemporal Time Series Forecasting
Echo state networks are computationally lightweight reservoir models ins...

01/23/2022 · Imposing Connectome-Derived Topology on an Echo State Network
Can connectome-derived constraints inform computation? In this paper we ...

10/27/2020 · Hybrid Backpropagation Parallel Reservoir Networks
In many real-world applications, fully-differentiable RNNs such as LSTMs...

07/18/2019 · Convolutional Reservoir Computing for World Models
Recently, reinforcement learning models have achieved great success, com...
