Towards truly local gradients with CLAPP: Contrastive, Local And Predictive Plasticity

10/16/2020
by Bernd Illing, et al.

Back-propagation (BP) is costly to implement in hardware and implausible as a learning rule in the brain. However, BP is surprisingly successful at explaining neuronal activity patterns found along the cortical processing stream. We propose CLAPP, a locally implementable, unsupervised learning algorithm that minimizes a simple, layer-specific loss function and therefore does not need to back-propagate error signals. Each weight update depends only on the state variables of the pre- and post-synaptic neurons and a layer-wide third factor. Networks trained with CLAPP build deep hierarchical representations of both images and speech.
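To make the locality of the rule concrete, the following is a minimal NumPy sketch of a CLAPP-style update for a single layer. The hinge-gated contrastive prediction, the layer-wide ±1 third factor, and the Hebbian form of the updates follow the abstract's description; the dimensions, learning rate, and all names (`W`, `W_pred`, `clapp_step`) are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes and learning rate (assumptions, not values from the paper).
n_in, n_out = 64, 32
eta = 0.01

W = rng.normal(scale=0.1, size=(n_out, n_in))        # feed-forward weights of one layer
W_pred = rng.normal(scale=0.1, size=(n_out, n_out))  # layer-local prediction weights

def relu(x):
    return np.maximum(x, 0.0)

def clapp_step(x_t, x_pos, x_neg):
    """One CLAPP-like update for a single layer.

    x_t   -- input at time t
    x_pos -- input at t+1 (positive, temporally adjacent sample)
    x_neg -- input from an unrelated context (negative sample)
    """
    global W, W_pred
    u_t = W @ x_t
    z_t = relu(u_t)
    for x_f, label in ((x_pos, +1.0), (x_neg, -1.0)):
        u_f = W @ x_f
        z_f = relu(u_f)
        score = z_f @ (W_pred @ z_t)      # how well z_t predicts the future code
        if 1.0 - label * score > 0.0:     # hinge margin violated: plasticity is on
            # 'label' acts as the layer-wide third factor (+1 or -1),
            # broadcast identically to every synapse of the layer.
            post_f = (W_pred @ z_t) * (u_f > 0)    # post-synaptic factor, future step
            post_t = (W_pred.T @ z_f) * (u_t > 0)  # post-synaptic factor, current step
            W_pred += eta * label * np.outer(z_f, z_t)
            W += eta * label * (np.outer(post_f, x_f) + np.outer(post_t, x_t))

# Toy usage: three random vectors standing in for consecutive input frames.
x_t, x_pos, x_neg = rng.normal(size=(3, n_in))
clapp_step(x_t, x_pos, x_neg)
```

Note that every synaptic change in this sketch is a product of a pre-synaptic activity, a post-synaptic factor, and the shared third factor, so nothing needs to be back-propagated across layers.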


Related research

02/23/2021 · Scaling up learning with GAIT-prop
Backpropagation of error (BP) is a widely used and highly successful lea...

12/09/2016 · Towards deep learning with spiking neurons in energy based models with contrastive Hebbian plasticity
In machine learning, error back-propagation in multi-layer neural networ...

06/25/2020 · A Theoretical Framework for Target Propagation
The success of deep learning, a brain-inspired form of AI, has sparked i...

12/16/2016 · Neuromorphic Deep Learning Machines
An ongoing challenge in neuromorphic computing is to devise general and ...

06/26/2018 · Unsupervised Learning by Competing Hidden Units
It is widely believed that the backpropagation algorithm is essential fo...

05/29/2023 · Understanding Predictive Coding as an Adaptive Trust-Region Method
Predictive coding (PC) is a brain-inspired local learning algorithm that...

06/10/2021 · Front Contribution instead of Back Propagation
Deep Learning's outstanding track record across several domains has stem...
