Echo State Networks trained by Tikhonov least squares are L2(μ) approximators of ergodic dynamical systems

05/14/2020
by Allen G. Hart, et al.

Echo State Networks (ESNs) are a class of single-layer recurrent neural networks with randomly generated internal weights, and a single layer of tuneable outer weights, which are usually trained by regularised linear least squares regression. Remarkably, ESNs still enjoy the universal approximation property despite the training procedure being entirely linear. In this paper, we prove that an ESN trained on a sequence of scalar observations from an ergodic dynamical system (with invariant measure μ) using Tikhonov least squares will approximate future observations of the dynamical system in the L2(μ) norm. We call this the ESN Training Theorem. We demonstrate the theory numerically by training an ESN using Tikhonov least squares on a sequence of scalar observations of the Lorenz system, and compare the invariant measure of these observations with the invariant measure of the future predictions of the autonomous ESN.
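As an illustration of the training procedure described in the abstract, here is a minimal NumPy sketch: scalar observations of the Lorenz system (its x-coordinate) drive a randomly generated reservoir, the linear readout is fitted by Tikhonov (ridge) least squares, and the trained ESN is then run autonomously by feeding its own one-step predictions back as input. The reservoir size, spectral radius, washout length, and regularisation parameter below are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Scalar observations of the Lorenz system (x-coordinate), generated with RK4.
def lorenz_rhs(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def simulate_lorenz(n_steps, dt=0.01, s0=(1.0, 1.0, 1.0)):
    s = np.array(s0, dtype=float)
    obs = np.empty(n_steps)
    for k in range(n_steps):
        k1 = lorenz_rhs(s)
        k2 = lorenz_rhs(s + 0.5 * dt * k1)
        k3 = lorenz_rhs(s + 0.5 * dt * k2)
        k4 = lorenz_rhs(s + dt * k3)
        s = s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        obs[k] = s[0]  # the scalar observation is the x-coordinate
    return obs

# Randomly generated internal weights (hyperparameters are assumptions).
N, SPECTRAL_RADIUS, LAM, WASHOUT = 300, 0.9, 1e-6, 200
A = rng.normal(size=(N, N))
A *= SPECTRAL_RADIUS / np.max(np.abs(np.linalg.eigvals(A)))  # rescale internal weights
C = rng.uniform(-0.5, 0.5, size=N)      # input weights
zeta = rng.uniform(-0.5, 0.5, size=N)   # bias

def drive(u):
    """Run the reservoir on the input sequence u; keep post-washout states."""
    x, states = np.zeros(N), []
    for k, u_k in enumerate(u):
        x = np.tanh(A @ x + C * u_k + zeta)
        if k >= WASHOUT:
            states.append(x.copy())
    return np.array(states)

obs = simulate_lorenz(6000)
u_train, y_train = obs[:-1], obs[1:]   # one-step-ahead targets
X = drive(u_train)
y = y_train[WASHOUT:]                  # align targets with the kept states

# Tikhonov (ridge) least squares for the outer weights:
#   W = argmin ||X W - y||^2 + LAM ||W||^2 = (X^T X + LAM I)^{-1} X^T y
W = np.linalg.solve(X.T @ X + LAM * np.eye(N), X.T @ y)

# Autonomous prediction: feed the ESN's own output back as the next input.
x, preds = X[-1].copy(), []
for _ in range(2000):
    u_hat = W @ x                      # predicted next scalar observation
    preds.append(u_hat)
    x = np.tanh(A @ x + C * u_hat + zeta)
preds = np.array(preds)

# Crude comparison of long-run statistics of truth vs autonomous prediction.
print("observed  mean/std:", obs.mean(), obs.std())
print("predicted mean/std:", preds.mean(), preds.std())
```

Comparing the long-run mean and standard deviation of the autonomous predictions with those of the observed trajectory is only a rough proxy for comparing empirical invariant measures; histograms of the two time series give a closer analogue of the comparison described in the abstract. The hyperparameters above may need tuning before the autonomous run reproduces the Lorenz statistics well.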

