
Curvature Injected Adaptive Momentum Optimizer for Convolutional Neural Networks

09/26/2021
by Shiv Ram Dubey, et al.

In this paper, we propose a new approach, hereafter referred to as AdaInject, for gradient descent optimizers that injects curvature information into the adaptive momentum. Specifically, the curvature information, captured through the short-term parameter history, is used as a weight to inject a second-order moment into the update rule. The AdaInject approach thereby boosts the parameter update by exploiting curvature. The proposed approach is generic and can be integrated with any existing adaptive momentum stochastic gradient descent optimizer. The effectiveness of the AdaInject optimizer is demonstrated through theoretical analysis as well as toy examples, and we also establish the convergence property of the proposed injection-based optimizer. Further, we show the efficacy of the AdaInject approach through extensive experiments with the injected variants of state-of-the-art optimizers, i.e., AdamInject, diffGradInject, RadamInject, and AdaBeliefInject, on four benchmark datasets using different CNN models. The largest improvement in top-1 classification error rate, 16.54%, is observed with the diffGradInject optimizer and the ResNeXt29 model on the CIFAR10 dataset. Overall, we observe very promising performance improvements for existing optimizers when combined with the proposed AdaInject approach.
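The abstract does not give the exact update equations, but the mechanism it describes (a curvature proxy derived from the short-term parameter history, used as a weight for an injected second-order term) can be illustrated with a minimal, hypothetical Adam-style sketch. The function name adam_inject_step, the injection constant k, and the use of the previous-step parameter change as the curvature proxy are assumptions made here for illustration; the paper's actual AdaInject formulation may differ.

```python
import numpy as np

def adam_inject_step(theta, theta_prev, grad, m, v, t,
                     lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8, k=2.0):
    """One hypothetical curvature-injected Adam-style step (illustrative sketch only)."""
    delta = theta_prev - theta                       # short-term parameter history as a curvature proxy
    injected = grad + np.abs(delta) * grad ** 2 / k  # curvature-weighted second-order term injected into the gradient
    m = beta1 * m + (1 - beta1) * injected           # first moment, with the injected term
    v = beta2 * v + (1 - beta2) * grad ** 2          # second moment, as in Adam
    m_hat = m / (1 - beta1 ** t)                     # bias correction
    v_hat = v / (1 - beta2 ** t)
    theta_new = theta - lr * m_hat / (np.sqrt(v_hat) + eps)
    return theta_new, theta, m, v                    # new parameters, new theta_prev, updated moments

# Toy usage: minimize f(theta) = 0.5 * theta^2, so grad = theta.
theta, theta_prev = np.array([5.0]), np.array([5.0])
m, v = np.zeros(1), np.zeros(1)
for t in range(1, 501):
    theta, theta_prev, m, v = adam_inject_step(theta, theta_prev, theta, m, v, t, lr=0.05)
print(theta)  # approaches 0
```

Under this reading, a large recent parameter movement (the curvature proxy) amplifies the squared-gradient term folded into the first moment, which speeds up the update in directions where the parameter is still moving; the precise weighting used by AdaInject is given in the paper.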


07/19/2022
Moment Centralization based Gradient Descent Optimizers for Convolutional Neural Networks
Convolutional neural networks (CNNs) have shown very appealing performan...

03/28/2019
PAL: A fast DNN optimization method based on curvature information
We present a novel optimizer for deep neural networks that combines the ...

12/29/2018
SPI-Optimizer: an integral-Separated PI Controller for Stochastic Optimization
To overcome the oscillation problem in the classical momentum-based opti...

10/12/2022
AdaNorm: Adaptive Gradient Norm Correction based Optimizer for CNNs
The stochastic gradient descent (SGD) optimizers are generally used to t...

05/21/2021
AngularGrad: A New Optimization Technique for Angular Convergence of Convolutional Neural Networks
Convolutional neural networks (CNNs) are trained using stochastic gradie...

07/16/2019
SGD momentum optimizer with step estimation by online parabola model
In stochastic gradient descent, especially for neural network training, ...

10/11/2019
On Empirical Comparisons of Optimizers for Deep Learning
Selecting an optimizer is a central step in the contemporary deep learni...