Multi-layer Hebbian networks with modern deep learning frameworks

07/04/2021
by Thomas Miconi, et al.

Deep learning networks generally use non-biological learning methods. By contrast, networks based on more biologically plausible learning, such as Hebbian learning, show comparatively poor performance and are difficult to implement. Here we show that hierarchical, convolutional Hebbian learning can be implemented almost trivially with modern deep learning frameworks, by using specific losses whose gradients produce exactly the desired Hebbian updates. We provide expressions whose gradients exactly implement a plain Hebbian rule (dw = xy), Grossberg's instar rule (dw = y(x-w)), and Oja's rule (dw = y(x-yw)). As an application, we build Hebbian convolutional multi-layer networks for object recognition. We observe that higher layers of such networks tend to learn large, simple features (Gabor-like filters and blobs), explaining the previously reported decrease in decoding performance over successive layers. To combat this tendency, we introduce interventions (denser activations with sparse plasticity, pruning of connections between layers) which result in sparser learned features, massively increase performance, and allow information to increase over successive layers. We hypothesize that more advanced techniques (dynamic stimuli, trace learning, feedback connections, etc.), together with the massive computational boost offered by modern deep learning frameworks, could greatly improve the performance and biological relevance of multi-layer Hebbian networks.
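The core trick in the abstract is that each Hebbian rule can be obtained as the negative gradient of a surrogate loss in which the post-synaptic activation y is treated as a constant (i.e., "detached" from the graph, in autograd terms). A minimal framework-agnostic sketch of this idea follows, using finite differences in place of autograd to verify the correspondence; the specific loss expressions below are plausible candidates consistent with the stated rules, not necessarily the exact ones in the paper:

```python
# For a linear unit y = w . x, gradient DESCENT on each loss below
# (with y held fixed as a constant) yields the corresponding Hebbian update.

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def num_grad(f, w, eps=1e-6):
    """Central-difference numerical gradient of f at w."""
    g = []
    for i in range(len(w)):
        wp, wm = list(w), list(w)
        wp[i] += eps
        wm[i] -= eps
        g.append((f(wp) - f(wm)) / (2 * eps))
    return g

x = [0.5, -1.0, 2.0]
w = [0.3, 0.7, -0.2]
y = dot(w, x)  # post-synaptic activation, treated as a fixed constant below

# Plain Hebb, dw = x*y:       L = -y * (w . x)
L_hebb   = lambda v: -y * dot(v, x)
# Instar, dw = y*(x - w):     L = y * (0.5*||w||^2 - w . x)
L_instar = lambda v: y * (0.5 * dot(v, v) - dot(v, x))
# Oja, dw = y*(x - y*w):      L = y * (0.5*y*||w||^2 - w . x)
L_oja    = lambda v: y * (0.5 * y * dot(v, v) - dot(v, x))

# The descent direction (-grad) reproduces each rule's update:
hebb_update   = [-g for g in num_grad(L_hebb, w)]
instar_update = [-g for g in num_grad(L_instar, w)]
oja_update    = [-g for g in num_grad(L_oja, w)]
```

In an autograd framework the same effect is achieved by detaching y before forming the loss, so that differentiation flows only through the second factor of y (which has gradient x with respect to w); an ordinary optimizer step then applies the Hebbian update at learning-rate scale.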


