Forecasting Using Reservoir Computing: The Role of Generalized Synchronization

02/04/2021
by Jason A. Platt, et al.

Reservoir computers (RC) are a form of recurrent neural network (RNN) used for forecasting time series data. As with all RNNs, selecting the hyperparameters presents a challenge when training on new inputs. We present a method based on generalized synchronization (GS) that gives direction in designing and evaluating the architecture and hyperparameters of an RC. The "auxiliary method" for detecting GS provides a pre-training test that guides hyperparameter selection. Furthermore, we provide a metric for a "well trained" RC using the reproduction of the input system's Lyapunov exponents.
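The auxiliary method amounts to driving two identical copies of the reservoir, started from different initial states, with the same input signal: if the two state trajectories converge, the reservoir state has become a function of the drive alone, which is the defining property of generalized synchronization. Below is a minimal sketch of such a pre-training check, assuming a standard leaky-tanh echo state network update; the matrices A and W_in, the leak rate, the sine-wave drive, and the tolerance are illustrative choices, not the authors' exact setup.

import numpy as np

def reservoir_step(r, u_t, A, W_in, leak):
    # One leaky-tanh update of the reservoir state r driven by input sample u_t.
    return (1.0 - leak) * r + leak * np.tanh(A @ r + W_in @ u_t)

def auxiliary_gs_test(A, W_in, leak, u, rng, tol=1e-6):
    # Drive two copies of the same reservoir from different random initial
    # states with the identical input sequence u. If the trajectories converge,
    # the reservoir state is a function of the input alone, i.e. generalized
    # synchronization holds for this choice of hyperparameters.
    n = A.shape[0]
    r1 = rng.uniform(-1.0, 1.0, n)
    r2 = rng.uniform(-1.0, 1.0, n)
    for u_t in u:
        r1 = reservoir_step(r1, u_t, A, W_in, leak)
        r2 = reservoir_step(r2, u_t, A, W_in, leak)
    gap = np.linalg.norm(r1 - r2)
    return gap < tol, gap

# Illustrative usage: a 200-node reservoir driven by a scalar sine wave.
rng = np.random.default_rng(0)
n, spectral_radius, leak = 200, 0.9, 0.5
A = rng.uniform(-1.0, 1.0, (n, n))
A *= spectral_radius / np.max(np.abs(np.linalg.eigvals(A)))  # rescale connectivity
W_in = rng.uniform(-0.5, 0.5, (n, 1))
u = np.sin(0.1 * np.arange(5000)).reshape(-1, 1)
synced, gap = auxiliary_gs_test(A, W_in, leak, u, rng)
print(f"GS detected: {synced}, final state separation: {gap:.2e}")

In this sketch, a small final separation indicates that GS holds and the reservoir is a candidate for training; if the two states fail to converge, hyperparameters such as the spectral radius or leak rate should be adjusted before fitting the readout.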

Related research

07/28/2023  A Distance Correlation-Based Approach to Characterize the Effectiveness of Recurrent Neural Networks for Time Series Forecasting
Time series forecasting has received a lot of attention with recurrent n...

01/23/2023  Learning Reservoir Dynamics with Temporal Self-Modulation
Reservoir computing (RC) can efficiently process time-series data by tra...

04/11/2022  Lyapunov-Guided Embedding for Hyperparameter Selection in Recurrent Neural Networks
Recurrent Neural Networks (RNN) are ubiquitous computing systems for seq...

04/08/2020  Reservoir Computing using High Order Synchronization of Coupled Oscillators
We propose a concept for reservoir computing on oscillators using the hi...

08/27/2021  Parallel Machine Learning for Forecasting the Dynamics of Complex Networks
Forecasting the dynamics of large complex networks from previous time-se...

07/16/2022  Hyperparameter Tuning in Echo State Networks
Echo State Networks represent a type of recurrent neural network with a ...
