Convergence and scaling of Boolean-weight optimization for hardware reservoirs

05/13/2023
by Louis Andreoli, et al.

Hardware implementations of neural networks are an essential step toward next-generation, efficient and powerful artificial-intelligence solutions. Besides realizing a parallel, efficient and scalable hardware architecture, optimizing the system's extremely large parameter space with sampling-efficient approaches is essential. Here, we analytically derive the scaling laws of highly efficient Coordinate Descent applied to optimizing the readout layer of a randomly and recurrently connected neural network, a reservoir. We demonstrate that convergence is exponential and scales linearly with the network's number of neurons. Our results accurately reproduce the convergence and scaling of a large-scale photonic reservoir implemented in a proof-of-concept experiment. Our work therefore provides a solid foundation for such optimization in hardware networks, and identifies promising directions for accelerating convergence during learning by leveraging measures of a neural network's amplitude statistics and the weight-update rule.
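The optimization the abstract describes, Coordinate Descent over a Boolean readout layer, can be sketched as follows. This is a minimal illustration, not the paper's method in full: the reservoir states, target signal, and dimensions are random stand-ins for the photonic hardware, and the acceptance rule (keep a weight flip only if it lowers the error) is the simplest greedy variant.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions: N reservoir neurons, T training samples.
N, T = 50, 200

# Stand-in reservoir states and target signal; in the paper these would
# come from the photonic reservoir's measured node amplitudes.
states = rng.standard_normal((T, N))
target = rng.standard_normal(T)

def nmse(w):
    """Normalized mean-square error of the Boolean readout w in {-1, +1}^N."""
    y = states @ w / N
    return np.mean((y - target) ** 2) / np.var(target)

# Boolean-weight Coordinate Descent: flip one weight at a time and keep
# the flip only if it lowers the error; stop when no flip helps.
w = rng.choice([-1.0, 1.0], size=N)
err = nmse(w)
err0 = err  # initial error, kept for reference
for epoch in range(20):
    improved = False
    for i in rng.permutation(N):  # visit coordinates in random order
        w[i] *= -1                # trial flip of one Boolean weight
        trial = nmse(w)
        if trial < err:
            err = trial
            improved = True
        else:
            w[i] *= -1            # revert the flip
    if not improved:              # converged: no single flip improves
        break
```

Because a flip is only accepted when it reduces the error, the error sequence is monotonically non-increasing; the paper's analysis concerns how fast this descent converges as the number of neurons grows.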


