Bayesian optimisation of large-scale photonic reservoir computers

04/06/2020
by Piotr Antonik, et al.

Introduction. Reservoir computing is a growing paradigm for the simplified training of recurrent neural networks, with a high potential for hardware implementations. Numerous experiments in optics and electronics have yielded performance comparable to state-of-the-art digital algorithms. Many of the most recent works in the field focus on large-scale photonic systems, with tens of thousands of physical nodes and arbitrary interconnections. While this trend significantly expands the potential applications of photonic reservoir computing, it also complicates the optimisation of the system's many hyper-parameters. Methods. In this work, we propose the use of Bayesian optimisation for efficient exploration of the hyper-parameter space in a minimum number of iterations. Results. We test this approach on a previously reported large-scale experimental system, compare it to the commonly used grid search, and report notable improvements in performance and a reduction in the number of experimental iterations required to optimise the hyper-parameters. Conclusion. Bayesian optimisation thus has the potential to become the standard method for tuning hyper-parameters in photonic reservoir computing.
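The core idea, a Gaussian-process surrogate that chooses the next hyper-parameter setting to evaluate so that far fewer runs are needed than with an exhaustive grid, can be illustrated in software. The sketch below is a minimal, hypothetical example only: a small simulated echo state network stands in for the photonic reservoir, the NARMA-10 benchmark replaces the experimental task, and the three tuned hyper-parameters (input scaling, feedback gain, ridge regularisation) as well as the use of scikit-optimize's gp_minimize are illustrative assumptions, not the authors' experimental setup.

```python
# Illustrative sketch only: a software echo state network stands in for the
# photonic reservoir; scikit-optimize's Gaussian-process optimiser tunes three
# common hyper-parameters. The task (NARMA-10) and parameter ranges are assumptions.
import numpy as np
from skopt import gp_minimize
from skopt.space import Real

rng = np.random.default_rng(0)

def narma10(length):
    """Generate input u and NARMA-10 target y."""
    u = rng.uniform(0, 0.5, length)
    y = np.zeros(length)
    for t in range(10, length - 1):
        y[t + 1] = (0.3 * y[t] + 0.05 * y[t] * y[t - 9:t + 1].sum()
                    + 1.5 * u[t] * u[t - 9] + 0.1)
    return u, y

N = 200                                        # number of reservoir nodes
W = rng.standard_normal((N, N))
W /= np.max(np.abs(np.linalg.eigvals(W)))      # normalise spectral radius to 1
W_in = rng.uniform(-1, 1, N)                   # input mask

def run_reservoir(u, input_scale, gain):
    """Drive the reservoir with input u and collect the node states."""
    x = np.zeros(N)
    states = np.empty((len(u), N))
    for t, u_t in enumerate(u):
        x = np.tanh(gain * W @ x + input_scale * W_in * u_t)
        states[t] = x
    return states

u, y = narma10(3000)
split, washout = 2000, 100

def nmse(params):
    """Validation NMSE for one hyper-parameter setting (the BO objective)."""
    input_scale, gain, log_ridge = params
    X = run_reservoir(u, input_scale, gain)
    X_tr, y_tr = X[washout:split], y[washout:split]
    X_va, y_va = X[split:], y[split:]
    ridge = 10.0 ** log_ridge
    w_out = np.linalg.solve(X_tr.T @ X_tr + ridge * np.eye(N), X_tr.T @ y_tr)
    pred = X_va @ w_out
    return float(np.mean((pred - y_va) ** 2) / np.var(y_va))

# Gaussian-process Bayesian optimisation over the three hyper-parameters.
space = [Real(0.05, 2.0, name="input_scale"),
         Real(0.1, 1.5, name="gain"),
         Real(-8, 0, name="log10_ridge")]
result = gp_minimize(nmse, space, n_calls=30, random_state=0)
print("best NMSE:", result.fun, "at", result.x)
```

In this toy setting the objective is evaluated only 30 times, whereas a grid of comparable resolution over three parameters (say 10 points per axis) would require 1,000 evaluations; this is the kind of saving that makes Bayesian optimisation attractive when each evaluation is a full experimental run on the photonic system.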

