Holomorphic Equilibrium Propagation Computes Exact Gradients Through Finite Size Oscillations

by Axel Laborieux et al.

Equilibrium propagation (EP) is an alternative to backpropagation (BP) that allows the training of deep neural networks with local learning rules. It thus provides a compelling framework for training neuromorphic systems and understanding learning in neurobiology. However, EP requires infinitesimal teaching signals, thereby limiting its applicability in noisy physical systems. Moreover, the algorithm requires separate temporal phases and has not been applied to large-scale problems. Here we address these issues by extending EP to holomorphic networks. We show analytically that this extension naturally leads to exact gradients even for finite-amplitude teaching signals. Importantly, the gradient can be computed as the first Fourier coefficient from finite neuronal activity oscillations in continuous time without requiring separate phases. Further, we demonstrate in numerical simulations that our approach permits robust estimation of gradients in the presence of noise and that deeper models benefit from the finite teaching signals. Finally, we establish the first benchmark for EP on the ImageNet 32x32 dataset and show that it matches the performance of an equivalent network trained with BP. Our work provides analytical insights that enable scaling EP to large-scale problems and establishes a formal framework for how oscillations could support learning in biological and neuromorphic systems.
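The core analytical claim — that for a holomorphic system the gradient equals the first Fourier coefficient of the response to a finite-amplitude oscillating probe — can be illustrated on a scalar function. The sketch below is not the paper's implementation; the function name and parameters are illustrative. It probes a holomorphic function on a circle of finite radius `beta` and recovers the exact derivative from the first Fourier coefficient, up to aliasing of harmonics above the number of sample points.

```python
import numpy as np

def grad_via_first_fourier(f, x, beta=0.5, num_points=16):
    """Estimate f'(x) from the first Fourier coefficient of f sampled
    on a circle of finite radius beta around x. For holomorphic f,
    f(x + beta*e^{i*theta}) = sum_n f^(n)(x) beta^n e^{i*n*theta} / n!,
    so the n = 1 coefficient equals beta * f'(x) exactly (modulo
    aliasing from harmonics above num_points)."""
    thetas = 2.0 * np.pi * np.arange(num_points) / num_points
    # Finite-amplitude oscillating probe around the operating point x.
    values = f(x + beta * np.exp(1j * thetas))
    # First Fourier coefficient of the resulting oscillation.
    coeff = np.mean(values * np.exp(-1j * thetas))
    return coeff.real / beta

# Example: d/dx tanh(x) at x = 0.3 is 1 - tanh(0.3)**2.
est = grad_via_first_fourier(np.tanh, 0.3)
exact = 1.0 - np.tanh(0.3) ** 2
```

Note that `beta` here is genuinely finite, not infinitesimal: the estimate stays exact as long as the probe circle lies inside the function's domain of analyticity, which is the property the paper exploits to tolerate finite teaching signals and noise.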


Related papers:

- Improving equilibrium propagation without weight symmetry through Jacobian homeostasis
- A deep learning theory for neural networks grounded in physics
- Sequence Learning using Equilibrium Propagation
- Continual Weight Updates and Convolutional Architectures for Equilibrium Propagation
- Activation Relaxation: A Local Dynamical Approximation to Backpropagation in the Brain
- Teaching a neural network with non-tunable exciton-polariton nodes
