Simple Cycle Reservoirs are Universal

08/21/2023
by Boyu Li, et al.

Reservoir computation models form a subclass of recurrent neural networks with fixed, non-trainable input and dynamic coupling weights. Only the static readout from the state space (the reservoir) is trainable, which avoids the known problems with propagating gradient information backwards through time. Reservoir models have been successfully applied to a variety of tasks and have been shown, under various settings, to be universal approximators of time-invariant fading memory dynamic filters. Simple cycle reservoirs (SCR) have been suggested as a severely restricted reservoir architecture, with equal-weight ring connectivity among the reservoir units and binary input-to-reservoir weights sharing the same absolute value. Such architectures are well suited for hardware implementation and show no performance degradation in many practical tasks. In this contribution, we rigorously study the expressive power of SCR in the complex domain and show that they are capable of universal approximation of any unrestricted linear reservoir system (with continuous readout) and hence of any time-invariant fading memory filter over uniformly bounded input streams.
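The restricted architecture described above can be sketched in a few lines of NumPy. This is a minimal, real-valued illustration (the paper's analysis is carried out in the complex domain): the reservoir matrix is a ring with a single shared weight, the input weights share one absolute value and differ only in sign, and only the linear readout is fitted. The function names, parameter values, and the toy delayed-recall task are illustrative choices, not taken from the paper.

```python
import numpy as np

def make_scr(n_units, r=0.9, v=0.2, seed=0):
    """Build a simple cycle reservoir: ring coupling with one shared
    weight r, and input weights of equal magnitude v with random signs."""
    W = np.zeros((n_units, n_units))
    for i in range(n_units):
        W[(i + 1) % n_units, i] = r  # unit i feeds unit i+1 on the ring
    rng = np.random.default_rng(seed)
    w_in = v * rng.choice([-1.0, 1.0], size=n_units)  # binary-sign inputs
    return W, w_in

def run_reservoir(W, w_in, u):
    """Drive the reservoir with a scalar input stream u; return all states."""
    x = np.zeros(W.shape[0])
    states = []
    for u_t in u:
        x = np.tanh(W @ x + w_in * u_t)
        states.append(x.copy())
    return np.array(states)

# Toy task (illustrative): reproduce the input delayed by 5 steps.
T, delay, warm = 2000, 5, 100
u = np.random.default_rng(1).uniform(-1, 1, T)
W, w_in = make_scr(50)
X = run_reservoir(W, w_in, u)
y = np.roll(u, delay)

# The only trainable part: a static linear readout, fitted by ridge
# regression on the collected states (discarding a warm-up transient).
A, b = X[warm:], y[warm:]
w_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(A.shape[1]), A.T @ b)
mse = np.mean((A @ w_out - b) ** 2)
```

The ring topology means each reservoir unit is just a delayed, nonlinearly squashed copy of its predecessor, which is what makes the architecture attractive for hardware: one coupling weight and one input magnitude to realize physically.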


