Approximation Bounds for Random Neural Networks and Reservoir Systems

02/14/2020
by   Lukas Gonon, et al.

This work studies approximation based on single-hidden-layer feedforward and recurrent neural networks with randomly generated internal weights. These methods, in which only the last layer of weights and a few hyperparameters are optimized, have been successfully applied in a wide range of static and dynamic learning problems. Despite the popularity of this approach in empirical tasks, important theoretical questions regarding the relation between the unknown function, the weight distribution, and the approximation rate have remained open. In this work it is proved that, as long as the unknown function, functional, or dynamical system is sufficiently regular, it is possible to draw the internal weights of the random (recurrent) neural network from a generic distribution (not depending on the unknown object) and quantify the error in terms of the number of neurons and the hyperparameters. In particular, this proves that echo state networks with randomly generated weights are capable of approximating a wide class of dynamical systems arbitrarily well and thus provides the first mathematical explanation for their empirically observed success at learning dynamical systems.
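The setup described above — a single-hidden-layer network whose internal weights are drawn at random and frozen, with only the linear readout fitted — can be sketched in a few lines. The example below is a minimal illustration, not the paper's construction: the target function, the standard-normal weight distribution, the input scaling, and all sizes are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# A smooth ("sufficiently regular") target function to approximate.
def f(x):
    return np.sin(2 * np.pi * x)

# Random single-hidden-layer network: the internal weights A and biases b
# are drawn from a generic distribution and are never trained.
N = 200                                 # number of hidden neurons
A = rng.normal(scale=10.0, size=(N, 1)) # random input weights (scale is an illustrative choice)
b = rng.normal(scale=10.0, size=N)      # random biases

def features(x):
    # Hidden-layer activations for inputs x of shape (n, 1).
    return np.tanh(x @ A.T + b)

# Only the linear readout w is optimized, here by ordinary least squares.
x_train = rng.uniform(-1, 1, size=(500, 1))
H = features(x_train)
w, *_ = np.linalg.lstsq(H, f(x_train[:, 0]), rcond=None)

# Maximum approximation error on fresh points.
x_test = np.linspace(-1, 1, 200).reshape(-1, 1)
err = np.max(np.abs(features(x_test) @ w - f(x_test[:, 0])))
```

Because the hidden weights are fixed, fitting reduces to a linear least-squares problem in `w`; the paper's bounds quantify how the resulting error scales with the number of neurons `N` for regular targets.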


Related research

- 06/12/2022 — Universality and approximation bounds for echo state networks with random weights
- 05/14/2020 — Echo State Networks trained by Tikhonov least squares are L2(μ) approximators of ergodic dynamical systems
- 05/06/2021 — Metric Entropy Limits on Recurrent Neural Network Learning of Linear Dynamical Systems
- 11/28/2012 — Nature-Inspired Metaheuristic Algorithms: Success and New Challenges
- 09/10/2016 — Multiplex visibility graphs to investigate recurrent neural networks dynamics
- 04/02/2023 — Infinite-dimensional reservoir computing
- 03/08/2021 — Cluster-based Input Weight Initialization for Echo State Networks
