Reservoir Computers: Modal Decomposition and Optimization

01/13/2021
by Chad Nathe et al.

The topology of the network associated with a reservoir computer is typically not designed: both the connectivity and the weights are chosen at random. Optimization is rarely attempted, since the parameter space is usually too large. Here we investigate this problem for a class of reservoir computers for which we obtain a decomposition of the reservoir dynamics into modes, which can be computed independently of one another. Each mode depends on an eigenvalue of the network adjacency matrix. We then take a parametric approach in which the eigenvalues are parameters that can be appropriately designed and optimized. In addition, we introduce the application of a time shift to each individual mode. We show that manipulating the individual modes, either through the eigenvalues or through the time shifts, can lead to dramatic reductions in the training error.
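The decomposition can be illustrated on a simple linear reservoir, where diagonalizing the adjacency matrix splits the dynamics into independently evolving scalar modes. The sketch below is a minimal illustration only, not the paper's implementation: the reservoir model `r[k+1] = a*A r[k] + w u[k]`, the input signal, the one-step-prediction task, and the integer per-mode shifts `tau` are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 20, 500
u = np.sin(0.1 * np.arange(T))     # assumed scalar input signal
y_target = np.roll(u, -1)          # assumed task: one-step prediction

# Assumed linear reservoir: r[k+1] = a * A @ r[k] + w * u[k]
A = rng.standard_normal((N, N)) / np.sqrt(N)
w = rng.standard_normal(N)
a = 0.5 / np.max(np.abs(np.linalg.eigvals(A)))   # scale for stability

# Eigen-decomposition A = V diag(lam) V^{-1}; in the modal basis
# m = V^{-1} r, each mode obeys m_i[k+1] = a*lam_i*m_i[k] + (V^{-1}w)_i u[k]
lam, V = np.linalg.eig(A)
w_modal = np.linalg.solve(V, w)

M = np.zeros((T, N), dtype=complex)
for k in range(T - 1):
    M[k + 1] = a * lam * M[k] + w_modal * u[k]   # modes evolve independently

# Full reservoir state recovered from the modes: r[k] = V m[k]
R = (M @ V.T).real

# Per-mode time shifts (assumed small integer delays, circular for brevity)
tau = rng.integers(0, 5, size=N)
M_shift = np.stack([np.roll(M[:, i], tau[i]) for i in range(N)], axis=1)

# Ridge-regression readout trained on the (shifted) modes
X = np.hstack([M_shift.real, M_shift.imag])
beta = np.linalg.solve(X.T @ X + 1e-6 * np.eye(2 * N), X.T @ y_target)
train_err = np.mean((X @ beta - y_target) ** 2)
```

Because the modes decouple, changing one eigenvalue `lam[i]` or one shift `tau[i]` only requires recomputing that single scalar recursion, which is what makes the parametric optimization over eigenvalues and time shifts tractable.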


Related research

10/01/2019 · Forecasting Chaotic Systems with Very Low Connectivity Reservoir Computers
We explore the hyperparameter space of reservoir computers used for fore...

02/08/2018 · Using a reservoir computer to learn chaotic attractors, with applications to chaos synchronisation and cryptography
Using the machine learning approach known as reservoir computing, it is ...

11/29/2022 · Optimizing time-shifts for reservoir computing using a rank-revealing QR algorithm
Reservoir computing is a recurrent neural network paradigm in which only...

09/16/2020 · Improving Delay Based Reservoir Computing via Eigenvalue Analysis
We analyze the reservoir computation capability of the Lang-Kobayashi sy...

11/15/2020 · Transfer learning of chaotic systems
Can a neural network trained by the time series of system A be used to p...

08/24/2020 · Adding Filters to Improve Reservoir Computer Performance
Reservoir computers are a type of neuromorphic computer that may be buil...

05/08/2019 · Evaluating the Stability of Recurrent Neural Models during Training with Eigenvalue Spectra Analysis
We analyze the stability of recurrent networks, specifically, reservoir ...
