Layer Collaboration in the Forward-Forward Algorithm

05/21/2023
by Guy Lorberbom, et al.

Backpropagation, based on the chain rule, is the de facto standard algorithm for optimizing neural networks today. Recently, Hinton (2022) proposed the forward-forward algorithm, a promising alternative that optimizes neural networks layer by layer, without propagating gradients through the network. Although such an approach has several advantages over backpropagation and shows promising results, the fact that each layer is trained independently limits the optimization process. Specifically, it prevents the network's layers from collaborating to learn complex and rich features. In this work, we study layer collaboration in the forward-forward algorithm. We show that the current version of the forward-forward algorithm is suboptimal when considering information flow in the network, resulting in a lack of collaboration between the layers of the network. We propose an improved version that supports layer collaboration to better utilize the network structure, without requiring any additional assumptions or computations. We empirically demonstrate the efficacy of the proposed version when considering both information flow and objective metrics. Additionally, we provide a theoretical motivation for the proposed method, inspired by functional entropy theory.
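To make the layer-local training discussed in the abstract concrete, below is a minimal PyTorch sketch of the baseline forward-forward recipe from Hinton (2022): each layer maximizes a "goodness" score (mean squared activation) on positive data and minimizes it on negative data, and its output is detached before being passed on, so no gradient flows between layers. This is an illustrative sketch, not the authors' code or their proposed collaborative variant; the FFLayer class, the threshold of 2.0, the layer sizes, and the random positive/negative batches are placeholder choices.

# Minimal, illustrative sketch of layer-wise forward-forward training.
import torch
import torch.nn as nn

class FFLayer(nn.Module):
    def __init__(self, in_dim, out_dim, threshold=2.0, lr=0.03):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.act = nn.ReLU()
        self.threshold = threshold
        # Each layer owns its optimizer: training is purely local.
        self.opt = torch.optim.Adam(self.parameters(), lr=lr)

    def forward(self, x):
        # Normalize the input so only its direction (not its length, i.e. the
        # previous layer's goodness) carries information to this layer.
        x = x / (x.norm(dim=1, keepdim=True) + 1e-8)
        return self.act(self.linear(x))

    def train_step(self, x_pos, x_neg):
        # Goodness = mean squared activation; push it above the threshold for
        # positive samples and below it for negative samples.
        g_pos = self.forward(x_pos).pow(2).mean(dim=1)
        g_neg = self.forward(x_neg).pow(2).mean(dim=1)
        loss = torch.log1p(torch.exp(torch.cat([
            self.threshold - g_pos,   # positive samples: goodness too low
            g_neg - self.threshold,   # negative samples: goodness too high
        ]))).mean()
        self.opt.zero_grad()
        loss.backward()
        self.opt.step()
        # Detach before handing off to the next layer: no gradient crosses the
        # layer boundary, which is the lack of collaboration the paper studies.
        return self.forward(x_pos).detach(), self.forward(x_neg).detach()

# Usage: train a small stack layer by layer on placeholder positive/negative data.
layers = [FFLayer(784, 500), FFLayer(500, 500)]
x_pos, x_neg = torch.randn(64, 784), torch.randn(64, 784)  # stand-in batches
for layer in layers:
    x_pos, x_neg = layer.train_step(x_pos, x_neg)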


Related research

02/10/2023 - Graph Neural Networks Go Forward-Forward
We present the Graph Forward-Forward (GFF) algorithm, an extension of th...

03/17/2023 - A Two-Step Rule for Backpropagation
We present a simplified computational rule for the back-propagation form...

09/21/2023 - A Study of Forward-Forward Algorithm for Self-Supervised Learning
Self-supervised representation learning has seen remarkable progress in ...

04/16/2021 - Arithmetic Distribution Neural Network for Background Subtraction
We propose a new Arithmetic Distribution Neural Network (ADNN) for learn...

06/08/2017 - Forward Thinking: Building and Training Neural Networks One Layer at a Time
We present a general framework for training deep neural networks without...

07/09/2023 - Extending the Forward Forward Algorithm
The Forward Forward algorithm, proposed by Geoffrey Hinton in November 2...

06/04/2018 - Backdrop: Stochastic Backpropagation
We introduce backdrop, a flexible and simple-to-implement method, intuit...
