ZORB: A Derivative-Free Backpropagation Algorithm for Neural Networks

11/17/2020
by Varun Ranganathan, et al.

Gradient descent and backpropagation have enabled neural networks to achieve remarkable results in many real-world applications. Despite this ongoing success, training a neural network with gradient descent can be a slow and strenuous affair. We present a simple yet faster training algorithm called Zeroth-Order Relaxed Backpropagation (ZORB). Instead of calculating gradients, ZORB uses the pseudoinverse of targets to backpropagate information. ZORB is designed to reduce the time required to train deep neural networks without penalizing performance. To illustrate the speed-up, we trained a feed-forward neural network with 11 layers on MNIST and observed that ZORB converged 300 times faster than Adam while achieving a comparable error rate, without any hyperparameter tuning. We also broaden the scope of ZORB to convolutional neural networks and apply it to subsamples of the CIFAR-10 dataset. Experiments on standard classification and regression benchmarks demonstrate ZORB's advantage over traditional backpropagation with gradient descent.
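The core idea, fitting a layer's weights in closed form via the Moore-Penrose pseudoinverse rather than by gradient steps, can be illustrated with a minimal sketch. This is an illustrative reconstruction of the pseudoinverse technique on a single linear layer, not the authors' implementation; the data, shapes, and variable names are assumptions for the example.

```python
import numpy as np

# Sketch of pseudoinverse-based weight fitting, the gradient-free update
# at the heart of ZORB-style training (illustrative, not the paper's code).

rng = np.random.default_rng(0)

# Toy noiseless regression data: inputs X (n_samples x n_features),
# targets T produced by an unknown linear map W_true.
X = rng.normal(size=(200, 5))
W_true = rng.normal(size=(5, 3))
T = X @ W_true

# Instead of iterating gradient descent, solve the least-squares problem
# for the layer's weights in one step: W = pinv(X) @ T.
W = np.linalg.pinv(X) @ T

# For deeper networks, ZORB propagates targets backwards through each
# layer (inverting activations where needed); here we simply verify the
# closed-form fit on the single linear layer.
mse = np.mean((X @ W - T) ** 2)
print(mse < 1e-10)
```

Because the update is a single linear solve per layer, there is no learning rate or momentum to tune, which is consistent with the abstract's claim of training without hyperparameter tuning.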


