Two Routes to Scalable Credit Assignment without Weight Symmetry

02/28/2020
by   Daniel Kunin, et al.

The neural plausibility of backpropagation has long been disputed, primarily for its use of non-local weight transport: the biologically dubious requirement that one neuron instantaneously measure the synaptic weights of another. Until recently, attempts to create local learning rules that avoid weight transport have typically failed in the large-scale learning scenarios where backpropagation shines, e.g. ImageNet categorization with deep convolutional networks. Here, we investigate a recently proposed local learning rule that yields competitive performance with backpropagation and find that it is highly sensitive to metaparameter choices, requiring laborious tuning that does not transfer across network architectures. Our analysis indicates the underlying mathematical reason for this instability, allowing us to identify a more robust local learning rule that better transfers without metaparameter tuning. Nonetheless, we find a performance and stability gap between this local rule and backpropagation that widens with increasing model depth. We then investigate several non-local learning rules that relax the need for instantaneous weight transport into a more biologically plausible "weight estimation" process, showing that these rules match state-of-the-art performance on deep networks and operate effectively in the presence of noisy updates. Taken together, our results suggest two routes towards the discovery of neural implementations for credit assignment without weight symmetry: further improvement of local rules so that they perform consistently across architectures, and the identification of biological implementations for non-local learning mechanisms.
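To make the weight-transport problem concrete, here is a minimal NumPy sketch of feedback alignment, the best-known family of local rules that avoid weight symmetry (illustrative only, not the specific rules studied in the paper). Backpropagation would propagate the output error through the transpose of the forward weights, `W2.T`; feedback alignment instead uses a fixed random feedback matrix `B`, so no neuron ever needs to read out another's synapses. All names and sizes below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 4, 8, 2

# Two-layer network: x -> h = relu(W1 @ x) -> y = W2 @ h
W1 = rng.normal(scale=0.5, size=(n_hid, n_in))
W2 = rng.normal(scale=0.5, size=(n_out, n_hid))
B = rng.normal(scale=0.5, size=(n_hid, n_out))  # fixed random feedback, never learned

x = rng.normal(size=n_in)
target = np.array([1.0, -1.0])

losses = []
for _ in range(200):
    a1 = W1 @ x
    h = np.maximum(a1, 0.0)          # ReLU hidden layer
    y = W2 @ h
    e = y - target                   # gradient of 0.5 * ||y - target||^2 w.r.t. y
    # Backprop would use W2.T @ e here (weight transport);
    # feedback alignment substitutes the fixed random matrix B:
    delta_h = (B @ e) * (a1 > 0)
    W2 -= 0.05 * np.outer(e, h)
    W1 -= 0.05 * np.outer(delta_h, x)
    losses.append(0.5 * float(e @ e))

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

The loss still decreases because the forward weights gradually align with the fixed feedback, but, as the abstract notes, rules in this family have historically struggled to scale to ImageNet-sized problems; the "weight estimation" routes discussed above instead let the feedback weights track the forward weights over time rather than keeping them fixed.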


