A Theory of Local Learning, the Learning Channel, and the Optimality of Backpropagation

06/22/2015
by Pierre Baldi, et al.

In a physical neural system, where storage and processing are intimately intertwined, the rules for adjusting the synaptic weights can only depend on variables that are available locally, such as the activity of the pre- and post-synaptic neurons, resulting in local learning rules. A systematic framework for studying the space of local learning rules is obtained by first specifying the nature of the local variables, and then the functional form that ties them together into each learning rule. Such a framework also enables the systematic discovery of new learning rules and the exploration of relationships between learning rules and group symmetries. We study polynomial local learning rules stratified by their degree and analyze their behavior and capabilities in both linear and non-linear units and networks. Stacking local learning rules in deep feedforward networks leads to deep local learning. While deep local learning can learn interesting representations, it cannot learn complex input-output functions, even when targets are available for the top layer. Learning complex input-output functions requires local deep learning, where target information is communicated to the deep layers through a backward learning channel. The nature of the information communicated about the targets, and the structure of the learning channel, partition the space of learning algorithms. We estimate the learning channel capacity associated with several algorithms and show that backpropagation outperforms them by simultaneously maximizing the information rate and minimizing the computational cost, even in recurrent networks. The theory clarifies the concept of Hebbian learning, establishes the power and limitations of local learning rules, introduces the learning channel, which enables a formal analysis of the optimality of backpropagation, and explains the sparsity of the space of learning rules discovered so far.
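To make the stratification by polynomial degree concrete, below is a minimal sketch in NumPy (an illustrative setup, not code from the paper) of two degree-two local rules for a single linear unit. The simple Hebb rule, dw = eta * pre * post, uses only locally available activities; the delta rule, dw = eta * (target - post) * pre, becomes local once the target is communicated to the unit, a one-unit analogue of the backward learning channel. The linear-teacher setup and all variable names are assumptions made for this example.

```python
# Minimal sketch (assumed setup, not the paper's code): two degree-two
# polynomial local learning rules for a single linear unit y = w . x.
import numpy as np

rng = np.random.default_rng(0)
n, T, eta = 10, 2000, 0.01

X = rng.standard_normal((T, n))         # pre-synaptic activities
w_true = rng.standard_normal(n)         # linear "teacher" defining the targets
targets = X @ w_true

w_hebb = 0.01 * rng.standard_normal(n)  # simple Hebb: no target information
w_delta = np.zeros(n)                   # delta rule: target is a local variable

for x, t in zip(X, targets):
    y = x @ w_hebb
    w_hebb += eta * y * x               # dw = eta * post * pre
    y = x @ w_delta
    w_delta += eta * (t - y) * x        # dw = eta * (target - post) * pre

print("Hebb  distance to teacher:", np.linalg.norm(w_hebb - w_true))
print("Delta distance to teacher:", np.linalg.norm(w_delta - w_true))
```

Running the sketch, the Hebbian weights grow with the input statistics but do not approach the teacher, while the delta rule converges to it, mirroring the distinction drawn above between deep local learning and local deep learning with a learning channel.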


