The Cascaded Forward Algorithm for Neural Network Training

03/17/2023
by Gongpei Zhao, et al.

The backpropagation (BP) algorithm has been widely used as the mainstream learning procedure for neural networks over the past decade and has played a significant role in the development of deep learning. However, it has some limitations, such as getting stuck in local minima and suffering from vanishing or exploding gradients, which have also raised questions about its biological plausibility. To address these limitations, alternatives to backpropagation have begun to be explored, with the Forward-Forward (FF) algorithm being one of the best known. In this paper we propose a new learning framework for neural networks, the Cascaded Forward (CaFo) algorithm, which, like FF, does not rely on BP for optimization. Unlike FF, our framework directly outputs label distributions at each cascaded block, so it does not require the generation of additional negative samples and therefore leads to a more efficient process at both training and test time. Moreover, each block in our framework can be trained independently, so it can be easily deployed on parallel acceleration systems. The proposed method is evaluated on four public image classification benchmarks, and the experimental results show a significant improvement in prediction accuracy over the baseline.
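
To make the idea concrete, the following is a minimal, hypothetical PyTorch-style sketch of a cascaded-forward setup as described in the abstract: each block has its own label predictor trained with a local cross-entropy loss, and activations are detached between blocks so that no gradient (and hence no end-to-end backpropagation) crosses block boundaries. The block architecture, loss choice, and training loop here are illustrative assumptions, not the authors' implementation.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class CaFoBlock(nn.Module):
        """One cascaded block: a small feature extractor plus its own label predictor."""
        def __init__(self, in_ch, out_ch, num_classes):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.AdaptiveAvgPool2d(8),   # fixed 8x8 spatial size (illustrative choice)
            )
            self.predictor = nn.Linear(out_ch * 8 * 8, num_classes)

        def forward(self, x):
            h = self.features(x)
            logits = self.predictor(h.flatten(1))   # per-block label distribution (as logits)
            return h, logits

    def train_cafo(blocks, loader, epochs=1, lr=1e-3, device="cpu"):
        # Each block is optimized only with its own local loss; activations are
        # detached before being passed on, so no gradient crosses block boundaries.
        blocks = [b.to(device) for b in blocks]
        optims = [torch.optim.Adam(b.parameters(), lr=lr) for b in blocks]
        for _ in range(epochs):
            for x, y in loader:
                x, y = x.to(device), y.to(device)
                h = x
                for block, opt in zip(blocks, optims):
                    h_next, logits = block(h)
                    loss = F.cross_entropy(logits, y)   # local supervision only
                    opt.zero_grad()
                    loss.backward()
                    opt.step()
                    h = h_next.detach()                 # stop gradients between blocks
        return blocks

    # Example (hypothetical shapes): three blocks for 32x32 RGB images, 10 classes
    # blocks = [CaFoBlock(3, 32, 10), CaFoBlock(32, 64, 10), CaFoBlock(64, 64, 10)]

At test time the per-block label distributions could, for instance, be averaged to form the final prediction; the abstract does not specify the aggregation rule, so that detail is likewise assumed here.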

Related research

05/26/2023  Emergent representations in networks trained with the Forward-Forward algorithm
The Backpropagation algorithm, widely used to train neural networks, has...

05/22/2023  The Integrated Forward-Forward Algorithm: Integrating Forward-Forward and Shallow Backpropagation With Local Losses
The backpropagation algorithm, despite its widespread use in neural netw...

02/10/2023  Graph Neural Networks Go Forward-Forward
We present the Graph Forward-Forward (GFF) algorithm, an extension of th...

03/15/2023  SymBa: Symmetric Backpropagation-Free Contrastive Learning with Forward-Forward Algorithm for Optimizing Convergence
The paper proposes a new algorithm called SymBa that aims to achieve mor...

05/03/2018  Local Critic Training for Model-Parallel Learning of Deep Neural Networks
This paper proposes a novel approach to train deep neural networks in a ...

07/09/2023  Extending the Forward Forward Algorithm
The Forward Forward algorithm, proposed by Geoffrey Hinton in November 2...

11/25/2020  Backpropagation-Free Learning Method for Correlated Fuzzy Neural Networks
In this paper, a novel stepwise learning approach based on estimating de...
