Backprojection for Training Feedforward Neural Networks in the Input and Feature Spaces

04/05/2020
by Benyamin Ghojogh et al.

After the tremendous development of neural networks trained by backpropagation, it is a good time to develop other training algorithms that offer more insight into how networks work. In this paper, we propose a new algorithm for training feedforward neural networks that is considerably faster than backpropagation. The method is based on projection and reconstruction: at every layer, the projected data and the reconstructed labels are forced to be similar, and the weights are tuned accordingly, layer by layer. The proposed algorithm can be applied in both the input and feature spaces, where the two variants are named backprojection and kernel backprojection, respectively. This algorithm provides a projection-based perspective on neural networks. Experiments on synthetic datasets show the effectiveness of the proposed method.
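The layer-by-layer idea described in the abstract can be sketched roughly as follows. This is a hypothetical illustration, not the paper's actual update rules: it assumes the data are projected forward to a layer, the labels are backprojected (here via pseudo-inverses) down to that layer, and the layer's weights are then fit by least squares so the projection matches the backprojected target. The function name, the pseudo-inverse reconstruction, and the least-squares fit are all assumptions for illustration.

```python
import numpy as np

def backprojection_sketch(X, Y, hidden_dims, activation=np.tanh, n_epochs=10):
    """Hypothetical sketch of layer-wise projection/reconstruction training.

    X: (n_samples, d) input data; Y: (n_samples, c) one-hot labels.
    For each layer, the data are forward-projected to that layer and the
    labels are backprojected to it; the layer's weights are then tuned so
    the two agree (least-squares fit -- an illustrative assumption).
    """
    rng = np.random.default_rng(0)
    dims = [X.shape[1]] + hidden_dims + [Y.shape[1]]
    W = [rng.standard_normal((dims[i], dims[i + 1])) * 0.1
         for i in range(len(dims) - 1)]

    for _ in range(n_epochs):
        for l in range(len(W)):
            # Forward-project the data up to the input of layer l.
            H = X
            for j in range(l):
                H = activation(H @ W[j])
            # Backproject the labels down to layer l's output via
            # pseudo-inverses (an assumption; the activation's inverse
            # is skipped to keep the sketch simple).
            T = Y.astype(float)
            for j in range(len(W) - 1, l, -1):
                T = T @ np.linalg.pinv(W[j])
            # Tune W[l] so the projected data match the reconstructed labels.
            W[l], *_ = np.linalg.lstsq(H, T, rcond=None)
    return W
```

A kernel variant, as the abstract suggests, would apply the same scheme after mapping the inputs into a feature space; that extension is omitted here.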

