Kernelized information bottleneck leads to biologically plausible 3-factor Hebbian learning in deep networks

06/12/2020
by Roman Pogodin, et al.

The state-of-the-art machine learning approach to training deep neural networks, backpropagation, is implausible for real neural networks: neurons need to know their outgoing weights; training alternates between a forward pass (computation) and a backward pass (learning); and the algorithm needs a large amount of labeled data. Biologically plausible approximations to backpropagation, such as feedback alignment, solve the weight transport problem, but not the other two. Thus, fully biologically plausible learning rules have so far remained elusive. Here we present a family of learning rules that does not suffer from any of these problems. It is motivated by the information bottleneck principle (extended with kernel methods), in which networks learn to squeeze as much information as possible out of the input without sacrificing prediction of the output. The resulting rules have a 3-factor Hebbian structure: they require pre- and post-synaptic firing rates and a global error signal - the third factor - that can be supplied by a neuromodulator. Moreover, they do not require precise labels; instead, they rely only on the similarity between desired outputs. They thus solve all three implausibility issues of backpropagation. To obtain good performance on hard problems while keeping the learning rules biologically plausible, our rules additionally need divisive normalization - a known feature of biological networks. Finally, simulations show that our rule performs nearly as well as backpropagation on image classification tasks.
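To make the 3-factor structure concrete, here is a minimal NumPy sketch of this kind of pairwise, kernel-gated Hebbian update. It is an illustration, not the authors' code: the cosine similarity standing in for the kernel, the function name three_factor_update, and the toy network shapes are all assumptions made for the example.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy sizes: a single hidden layer; the kernelized objective compares
    # pairs of examples, so the update takes two inputs at a time.
    n_in, n_hidden = 20, 10
    W = rng.normal(scale=0.1, size=(n_hidden, n_in))

    def cosine(a, b):
        # Illustrative similarity measure (a stand-in for the paper's kernels).
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8)

    def three_factor_update(W, x1, x2, y1, y2, lr=0.01):
        # Pre-synaptic rates x and post-synaptic rates h are local to each
        # synapse; m is a single global scalar - the third factor - that a
        # neuromodulator could broadcast to every synapse.
        h1, h2 = np.tanh(W @ x1), np.tanh(W @ x2)
        # Global signal: does the similarity of the hidden responses match
        # the similarity of the desired outputs for this pair?
        m = cosine(y1, y2) - cosine(h1, h2)
        # Hebbian product of post- and pre-synaptic rates, gated by m.
        dW = m * (np.outer(h1, x1) + np.outer(h2, x2))
        return W + lr * dW

    # Example: a pair of inputs from the same class (identical one-hot labels).
    x1, x2 = rng.normal(size=n_in), rng.normal(size=n_in)
    y1 = y2 = np.eye(3)[0]
    W = three_factor_update(W, x1, x2, y1, y2)

Note that only pre- and post-synaptic rates and the broadcast scalar m enter the update, so no weight transport or separate backward pass is needed, and the labels enter only through their pairwise similarity - matching the abstract's claim that precise labels are not required.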


