Training Neural Networks with Local Error Signals

01/20/2019
by Arild Nøkland, et al.

Supervised training of neural networks for classification is typically performed with a global loss function. The loss function provides a gradient for the output layer, and this gradient is back-propagated to hidden layers to dictate an update direction for the weights. An alternative approach is to train the network with layer-wise loss functions. In this paper we demonstrate, for the first time, that layer-wise training can approach the state of the art on a variety of image datasets. We use single-layer sub-networks and two different supervised loss functions to generate local error signals for the hidden layers, and we show that the combination of these losses helps with optimization in the context of local learning. Using local errors could be a step towards more biologically plausible deep learning, because the global error does not have to be transported back to hidden layers. A completely backprop-free variant outperforms previously reported results among methods aiming for higher biological plausibility. Code is available at https://github.com/anokland/local-loss
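The core idea — each hidden layer gets its gradient from a small local sub-network rather than from a back-propagated global error — can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: it shows only the local cross-entropy ("prediction") loss on a single hidden layer with a linear classifier head, whereas the paper combines it with a similarity-matching loss and uses deeper convolutional networks. The toy data and all hyperparameters here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: 64 samples, 10 features, 2 classes.
X = rng.normal(size=(64, 10))
y = (X[:, 0] > 0).astype(int)
Y = np.eye(2)[y]  # one-hot targets

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# One hidden layer plus its local linear classifier head.
# The head's cross-entropy loss supplies this layer's error signal;
# no gradient flows in from layers above (their input would be "detached").
W1 = rng.normal(scale=0.1, size=(10, 16)); b1 = np.zeros(16)
H1 = rng.normal(scale=0.1, size=(16, 2))  # local sub-network (classifier head)

lr = 0.5
for step in range(200):
    a1 = np.maximum(0, X @ W1 + b1)   # ReLU hidden activation
    p1 = softmax(a1 @ H1)             # local prediction from the head
    g = (p1 - Y) / len(X)             # dL/dlogits for softmax cross-entropy
    # All gradients below are local: only this layer and its head update.
    dH1 = a1.T @ g
    da1 = (g @ H1.T) * (a1 > 0)
    W1 -= lr * (X.T @ da1); b1 -= lr * da1.sum(axis=0); H1 -= lr * dH1

acc = (p1.argmax(axis=1) == y).mean()
```

In a multi-layer network, the next layer would consume `a1` as a constant input and train against its own local head in the same way, so no error signal ever travels back through more than one layer.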
