Reservoir-size dependent learning in analogue neural networks

07/23/2019
by Xavier Porte, et al.

The implementation of artificial neural networks in hardware substrates is a major interdisciplinary enterprise. Well-suited candidates for physical implementations must combine nonlinear neurons with dedicated and efficient hardware solutions for both connectivity and training. Reservoir computing addresses the problems related to network connectivity and training in an elegant and efficient way. However, important questions regarding the impact of reservoir size and learning routines on convergence speed during learning remain unaddressed. Here, we study in detail the learning process of a recently demonstrated photonic neural network based on a reservoir. We use a greedy algorithm to train our neural network for the task of chaotic signal prediction and analyze the learning-error landscape. Our results unveil fundamental properties of the system's optimization hyperspace. In particular, we determine the convergence speed of learning as a function of reservoir size and find an exceptional, close-to-linear scaling. This linear dependence, together with our parallel diffractive coupling, represents optimal scaling conditions for our photonic neural network scheme.
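
To make the training procedure concrete, the sketch below is a minimal software analogue, not the authors' photonic hardware: an echo-state-style reservoir with fixed random internal weights whose Boolean readout weights are optimized by a greedy single-flip search on a one-step-ahead chaotic prediction task. The chaotic signal (a logistic map), the reservoir size, and all parameter values are illustrative assumptions rather than values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Chaotic target signal: a logistic map, standing in for the paper's prediction task.
def logistic_series(n, r=3.9, x0=0.4):
    x = np.empty(n)
    x[0] = x0
    for t in range(1, n):
        x[t] = r * x[t - 1] * (1.0 - x[t - 1])
    return x

N = 100          # reservoir size (illustrative)
T = 1000         # training length (illustrative)
u = logistic_series(T + 1)
target = u[1:]   # one-step-ahead prediction target

# Fixed random reservoir (echo-state style): input and recurrent weights are never trained.
W_in = rng.uniform(-0.5, 0.5, size=N)
W = rng.normal(0.0, 1.0, size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # keep the spectral radius below 1

states = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])
    states[t] = x

def nmse(w_bool):
    """Normalized prediction error of a Boolean readout (with a best-fit output gain)."""
    y = states @ w_bool
    gain = y @ target / max(y @ y, 1e-12)
    return np.mean((gain * y - target) ** 2) / np.var(target)

# Greedy Boolean-weight learning: flip one readout weight at a time, keep only improving flips.
w = rng.integers(0, 2, size=N).astype(float)  # Boolean readout weights in {0, 1}
err = nmse(w)
for epoch in range(5):
    for i in rng.permutation(N):   # visit the weights in random order
        w[i] = 1.0 - w[i]          # trial flip
        trial = nmse(w)
        if trial < err:
            err = trial            # improvement: keep the flip
        else:
            w[i] = 1.0 - w[i]      # no improvement: revert
    print(f"epoch {epoch}: NMSE = {err:.4f}")
```

In the abstract's terms, the quantity of interest is how many such greedy weight updates are needed before the error converges, and how that number grows with the reservoir size N.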


