Learn to Synchronize, Synchronize to Learn

10/06/2020
by Pietro Verzelli, et al.

In recent years, the machine learning community has shown a continuously growing interest in research investigating the dynamical aspects of both training procedures and trained models. Among recurrent neural networks, the Reservoir Computing (RC) paradigm stands out for its conceptual simplicity and fast training scheme. Yet, the guiding principles under which RC operates are only partially understood. In this work, we study the properties behind learning dynamical systems with RC and propose a new guiding principle, based on Generalized Synchronization (GS), that guarantees its feasibility. We show that the well-known Echo State Property (ESP) implies and is implied by GS, so that theoretical results derived from the ESP still hold when GS does. However, using GS one can profitably study the RC learning procedure by linking the reservoir dynamics with the readout training. Notably, this allows us to shed light on the interplay between the input encoding performed by the reservoir and the output produced by the readout optimized for the task at hand. In addition, we show that, as opposed to the ESP, satisfaction of GS can be measured by means of the Mutual False Nearest Neighbors index, which makes the theoretical results accessible to practitioners.
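The ESP/GS equivalence described above has a simple operational reading: when the property holds, the reservoir state becomes a function of the input history alone, so two copies of the reservoir driven by the same input from different initial states converge to the same trajectory. The sketch below illustrates this with a generic echo state network; all parameter choices (reservoir size, spectral radius 0.9, tanh activation, sinusoidal input) are illustrative assumptions and not the paper's setup.

```python
# Minimal sketch (not the authors' code): drive one echo state network with
# the same input from two different initial states and watch the state
# trajectories converge, as predicted when the ESP (equivalently, GS) holds.
import numpy as np

rng = np.random.default_rng(0)
n_res, n_in, T = 200, 1, 1000

# Random reservoir rescaled to spectral radius 0.9 -- a common heuristic
# for obtaining the ESP (exact conditions are input-dependent).
W = rng.standard_normal((n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))

u = np.sin(0.1 * np.arange(T)).reshape(T, n_in)  # arbitrary driving input

def run(x0):
    """Iterate x(t+1) = tanh(W x(t) + W_in u(t)) and return the trajectory."""
    x = x0.copy()
    traj = []
    for t in range(T):
        x = np.tanh(W @ x + W_in @ u[t])
        traj.append(x.copy())
    return np.array(traj)

# Same input, two random initial conditions.
xa = run(rng.standard_normal(n_res))
xb = run(rng.standard_normal(n_res))
gap = np.linalg.norm(xa - xb, axis=1)
print(gap[0], gap[-1])  # the distance between trajectories shrinks toward 0
```

Convergence of `gap` to zero is the synchronization-with-the-input phenomenon the paper formalizes; the Mutual False Nearest Neighbors index mentioned in the abstract goes further, testing whether the reservoir state is a continuous function of the driving system's state.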

