Conducting Credit Assignment by Aligning Local Representations

03/05/2018
by Alexander G. Ororbia, et al.

The use of back-propagation and its variants to train deep networks is often problematic for new users, with issues such as exploding gradients, vanishing gradients, and high sensitivity to weight initialization strategies making networks difficult to train. In this paper, we present Local Representation Alignment (LRA), a training procedure that is far less sensitive to poor initializations, requires no modifications to the network architecture, and can be adapted to networks with highly nonlinear and discrete-valued activation functions. Furthermore, we show that one variant of LRA can start from a null initialization of the network weights and still successfully train networks with a wide variety of nonlinearities, including tanh, ReLU-6, softplus, signum, and others that are more biologically plausible. Experiments on MNIST and Fashion MNIST validate the performance of the algorithm, showing that LRA trains networks robustly and effectively, succeeding even where back-propagation fails and outperforming alternative learning algorithms such as target propagation and feedback alignment.
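The paper itself specifies the exact update rules; as a rough illustration of the general idea behind target-based local learning, the sketch below shows a hypothetical LRA-style step for a two-layer tanh network. The shapes, the step size `beta`, the fixed random feedback matrix `E2`, and the plain delta-rule weight updates are all assumptions for illustration, not the authors' algorithm: each layer receives a locally computed target (propagated through `E2` rather than through the transpose of the forward weights) and updates only from its local mismatch, and the forward weights can start from the null (all-zero) initialization the abstract mentions.

```python
import numpy as np

def lra_step(params, x, y, lr=0.1, beta=0.1):
    """One hypothetical LRA-style update for a two-layer tanh network.

    Instead of back-propagating gradients, the hidden layer receives a
    *target* computed through a fixed random feedback matrix E2, and each
    layer updates its weights from a purely local error (a delta rule).
    """
    W1, W2, E2 = params["W1"], params["W2"], params["E2"]
    h = np.tanh(W1 @ x)            # hidden representation
    y_hat = np.tanh(W2 @ h)        # output representation
    e2 = y_hat - y                 # local error at the output layer
    h_tgt = h - beta * (E2 @ e2)   # hidden target: nudge h against the error
    e1 = h - h_tgt                 # local error at the hidden layer
    params["W2"] = W2 - lr * np.outer(e2, h)   # local delta-rule updates
    params["W1"] = W1 - lr * np.outer(e1, x)
    return float((e2 ** 2).mean())

# Null (all-zero) forward weights, as the abstract says LRA can tolerate;
# only the feedback matrix is random.
rng = np.random.default_rng(0)
params = {
    "W1": np.zeros((16, 8)),
    "W2": np.zeros((4, 16)),
    "E2": rng.normal(0.0, 0.5, (16, 4)),
}
x = rng.normal(size=8)
y = np.tanh(rng.normal(size=4))
losses = [lra_step(params, x, y) for _ in range(200)]
```

Note how the zero initialization is broken: the first hidden-layer update is driven entirely by the feedback path `E2 @ e2`, after which the hidden representation becomes nonzero and the output layer can begin learning, something a pure gradient signal through zero weights could not do.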


Related research

10/17/2018 · Online Learning of Recurrent Neural Architectures by Locally Aligning Distributed Representations
Temporal models based on recurrent neural networks have proven to be qui...

10/17/2018 · Continual Learning of Recurrent Neural Networks by Locally Aligning Distributed Representations
Temporal models based on recurrent neural networks have proven to be qui...

05/26/2018 · Biologically Motivated Algorithms for Propagating Local Target Representations
Finding biologically plausible alternatives to back-propagation of error...

06/03/2022 · A Robust Backpropagation-Free Framework for Images
While current deep learning algorithms have been successful for a wide v...

03/19/2018 · Deep learning improved by biological activation functions
`Biologically inspired' activation functions, such as the logistic sigmo...

12/23/2014 · Difference Target Propagation
Back-propagation has been the workhorse of recent successes of deep lear...

02/10/2020 · Reducing the Computational Burden of Deep Learning with Recursive Local Representation Alignment
Training deep neural networks on large-scale datasets requires significa...
