A Two-Step Rule for Backpropagation

03/17/2023
by Ahmed Boughammoura, et al.

We present a simplified computational rule for the back-propagation formulas of artificial neural networks. In this work, we provide a generic two-step rule for the back-propagation algorithm in matrix notation, covering both the forward and backward phases of the computations involved in the learning process. This recursive computing rule propagates the changes to all synaptic weights in the network, layer by layer, efficiently. In particular, we use the rule to compute both the up and down partial derivatives of the cost function for all the connections feeding into the output layer.
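
The paper's exact matrix-notation rule is given in the full text; as a rough illustration of the two-phase computation the abstract describes (a forward pass storing activations, then a backward pass propagating the error signal layer by layer), here is a minimal NumPy sketch, assuming a plain feed-forward network with logistic activations, no bias terms, and a squared-error cost. All names (two_step_backprop, sigma, etc.) are illustrative, not taken from the paper.

    import numpy as np

    def sigma(z):
        # Logistic activation, assumed here for illustration.
        return 1.0 / (1.0 + np.exp(-z))

    def sigma_prime(z):
        # Derivative of the logistic activation, used in the backward phase.
        s = sigma(z)
        return s * (1.0 - s)

    def two_step_backprop(x, y, weights):
        # Step 1 (forward): propagate activations layer by layer,
        # storing pre-activations z and activations a for reuse.
        zs, activations = [], [x]
        a = x
        for W in weights:
            z = W @ a
            zs.append(z)
            a = sigma(z)
            activations.append(a)
        # Step 2 (backward): propagate the error signal layer by layer
        # and accumulate the gradient of the squared-error cost.
        delta = (a - y) * sigma_prime(zs[-1])
        grads = [None] * len(weights)
        grads[-1] = np.outer(delta, activations[-2])
        for l in range(len(weights) - 2, -1, -1):
            delta = (weights[l + 1].T @ delta) * sigma_prime(zs[l])
            grads[l] = np.outer(delta, activations[l])
        return grads

For example, with two weight matrices Ws = [W1, W2], two_step_backprop(x, y, Ws) returns one gradient array per layer, mirroring the layer-by-layer propagation of weight changes described above.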

Related research

Backpropagation and F-adjoint (03/29/2023)
This paper presents a concise mathematical framework for investigating b...

Contrastive Hebbian Learning with Random Feedback Weights (06/19/2018)
Neural networks are commonly trained to make predictions through learnin...

Layer Collaboration in the Forward-Forward Algorithm (05/21/2023)
Backpropagation, which uses the chain rule, is the de-facto standard alg...

Accelerating Training in Artificial Neural Networks with Dynamic Mode Decomposition (06/18/2020)
Training of deep neural networks (DNNs) frequently involves optimizing s...

Gradient target propagation (10/19/2018)
We report a learning rule for neural networks that computes how much eac...

Identifying Learning Rules From Neural Network Observables (10/22/2020)
The brain modifies its synaptic strengths during learning in order to be...

Deriving Differential Target Propagation from Iterating Approximate Inverses (07/29/2020)
We show that a particular form of target propagation, i.e., relying on l...
