A Theoretical Framework for Target Propagation

06/25/2020
by Alexander Meulemans et al.

The success of deep learning, a brain-inspired form of AI, has sparked interest in understanding how the brain could similarly learn across multiple layers of neurons. However, the majority of biologically-plausible learning algorithms have not yet reached the performance of backpropagation (BP), nor are they built on strong theoretical foundations. Here, we analyze target propagation (TP), a popular but not yet fully understood alternative to BP, from the standpoint of mathematical optimization. Our theory shows that TP is closely related to Gauss-Newton optimization and thus substantially differs from BP. Furthermore, our analysis reveals a fundamental limitation of difference target propagation (DTP), a well-known variant of TP, in the realistic scenario of non-invertible neural networks. We provide a first solution to this problem through a novel reconstruction loss that improves feedback weight training, while simultaneously introducing architectural flexibility by allowing for direct feedback connections from the output to each hidden layer. Our theory is corroborated by experimental results that show significant improvements in performance and in the alignment of forward weight updates with loss gradients, compared to DTP.
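
To make the contrast with backpropagation concrete, below is a minimal NumPy sketch of the target computation at the heart of difference target propagation: an output target is formed by a small gradient step on the loss, hidden-layer targets are obtained through learned feedback mappings with a difference correction for imperfect inverses, and each layer then minimizes a purely local loss against its target. The two-hidden-layer network, the tanh nonlinearities, and all variable names are illustrative assumptions, not the paper's exact setup.

# A minimal sketch of difference target propagation (DTP) in NumPy.
import numpy as np

rng = np.random.default_rng(0)

def f(W, h):  # forward mapping of one layer
    return np.tanh(W @ h)

def g(Q, h):  # learned feedback mapping (approximate inverse of f)
    return np.tanh(Q @ h)

# Toy network: input x, two hidden layers, output h3 (all width 4).
W1, W2, W3 = (0.5 * rng.standard_normal((4, 4)) for _ in range(3))
Q2, Q3 = (0.5 * rng.standard_normal((4, 4)) for _ in range(2))
x, y = rng.standard_normal(4), rng.standard_normal(4)

# Forward pass.
h1 = f(W1, x)
h2 = f(W2, h1)
h3 = f(W3, h2)

# Output target: one small gradient step on the squared-error loss
# L = 0.5 * ||h3 - y||^2, whose gradient w.r.t. h3 is (h3 - y).
beta = 0.1
t3 = h3 - beta * (h3 - y)

# Propagate targets downward: t_i = g(t_{i+1}) + h_i - g(h_{i+1}).
# The correction term h_i - g(h_{i+1}) compensates for g being only
# an approximate inverse of the forward mapping.
t2 = g(Q3, t3) + h2 - g(Q3, h3)
t1 = g(Q2, t2) + h1 - g(Q2, h2)

# Each layer then trains its forward weights on a purely local loss,
# e.g. ||h_i - t_i||^2, with no gradient signal crossing layers.
local_losses = [float(np.sum((h - t) ** 2))
                for h, t in ((h1, t1), (h2, t2), (h3, t3))]
print(local_losses)

In the paper's proposed variant, the feedback parameters (Q2 and Q3 above) are trained with the novel reconstruction loss, and each hidden layer can instead receive its target through a direct feedback connection from the output layer; both refinements are omitted here for brevity.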


Related Research

01/31/2022
Towards Scaling Difference Target Propagation by Learning Backprop Targets
The development of biologically-plausible learning algorithms is importa...

07/12/2018
Assessing the Scalability of Biologically-Motivated Deep Learning Algorithms and Architectures
The backpropagation of error algorithm (BP) is often said to be impossib...

02/23/2017
Bidirectional Backpropagation: Towards Biologically Plausible Error Signal Transmission in Neural Networks
The back-propagation (BP) algorithm has been considered the de-facto met...

06/23/2020
Extension of Direct Feedback Alignment to Convolutional and Recurrent Neural Network for Bio-plausible Deep Learning
Throughout this paper, we focus on the improvement of the direct feedbac...

10/16/2020
Towards truly local gradients with CLAPP: Contrastive, Local And Predictive Plasticity
Back-propagation (BP) is costly to implement in hardware and implausible...

02/01/2022
Deep Layer-wise Networks Have Closed-Form Weights
There is currently a debate within the neuroscience community over the l...

06/01/2022
A Theoretical Framework for Inference Learning
Backpropagation (BP) is the most successful and widely used algorithm in...

Code Repositories

theoretical_framework_for_target_propagation

Python implementation of the methods in Meulemans et al. 2020 - A Theoretical Framework For Target Propagation

