Block-local learning with probabilistic latent representations

05/24/2023
by David Kappel, et al.

The ubiquitous backpropagation algorithm requires sequential updates across blocks of a network, introducing a locking problem. Moreover, backpropagation relies on the transpose of weight matrices to calculate updates, introducing a weight transport problem across blocks. Both these issues prevent efficient parallelisation and horizontal scaling of models across devices. We propose a new method that introduces a twin network that propagates information backwards from the targets to the input to provide auxiliary local losses. Forward and backward propagation can work in parallel and with different sets of weights, addressing the problems of weight transport and locking. Our approach derives from a statistical interpretation of end-to-end training which treats activations of network layers as parameters of probability distributions. The resulting learning framework uses these parameters locally to assess the matching between forward and backward information. Error backpropagation is then performed locally within each block, leading to "block-local" learning. Several previously proposed alternatives to error backpropagation emerge as special cases of our model. We present results on various tasks and architectures, including transformers, demonstrating state-of-the-art performance using block-local learning. These results provide a new principled framework to train very large networks in a distributed setting and can also be applied in neuromorphic systems.
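
The abstract describes the mechanism only at a high level. The sketch below illustrates the general idea in PyTorch under several assumptions of my own (the layer sizes, the use of a single backward block, and a unit-variance Gaussian interpretation of activations that reduces the local matching loss to a mean-squared error); it is not the authors' implementation. A twin backward network with its own weights propagates the target toward the input, each forward block matches its activations against the corresponding backward signal, and detach() keeps every update local to its block.

```python
# Minimal sketch of block-local learning with a twin backward network.
# Layer sizes and the MSE matching loss are illustrative assumptions.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Forward blocks (input -> hidden -> class logits).
f_blocks = nn.ModuleList([
    nn.Sequential(nn.Linear(784, 256), nn.ReLU()),
    nn.Sequential(nn.Linear(256, 10)),
])

# Twin backward block propagating the target toward the input with its own
# weights, so no transposed forward matrices are needed (no weight transport).
b_blocks = nn.ModuleList([
    nn.Linear(10, 256),
])

opt = torch.optim.Adam(
    list(f_blocks.parameters()) + list(b_blocks.parameters()), lr=1e-3
)

def train_step(x, y_onehot):
    # Backward (twin) pass: map the target back toward the input, detaching
    # between blocks so each backward block also learns from a local loss only.
    t = y_onehot
    back_targets = [t]
    for b in b_blocks:
        t = b(t.detach())
        back_targets.append(t)
    back_targets = back_targets[::-1]  # align targets with forward block outputs

    # Forward pass with one local matching loss per block. detach() stops
    # gradients at block boundaries; treating activations as means of
    # unit-variance Gaussians makes the matching loss a simple MSE.
    h = x
    loss = torch.zeros(())
    for f, t in zip(f_blocks, back_targets):
        h = f(h.detach())
        loss = loss + ((h - t) ** 2).mean()

    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()

# Example usage with random data standing in for a real dataset.
x = torch.randn(32, 784)
y = nn.functional.one_hot(torch.randint(0, 10, (32,)), num_classes=10).float()
print(train_step(x, y))
```

Because the blocks exchange only detached tensors, each block's update depends solely on its local loss, which is what would allow forward and backward passes to run in parallel and blocks to be placed on different devices.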

Related research

Forward Signal Propagation Learning (04/04/2022)
We propose a new learning algorithm for propagating a learning signal an...

Learning efficient backprojections across cortical hierarchies in real time (12/20/2022)
Models of sensory processing and learning in the cortex need to efficien...

Learning without feedback: Direct random target projection as a feedback-alignment algorithm with layerwise feedforward training (09/03/2019)
While the backpropagation of error algorithm allowed for a rapid rise in...

Correlative Information Maximization: A Biologically Plausible Approach to Supervised Deep Neural Networks without Weight Symmetry (06/07/2023)
The backpropagation algorithm has experienced remarkable success in trai...

Training Deep Architectures Without End-to-End Backpropagation: A Brief Survey (01/09/2021)
This tutorial paper surveys training alternatives to end-to-end backprop...

LoCo: Local Contrastive Representation Learning (08/04/2020)
Deep neural nets typically perform end-to-end backpropagation to learn t...

Backpropagation through nonlinear units for all-optical training of neural networks (06/30/2020)
Backpropagation through nonlinear neurons is an outstanding challenge to...
