Gravity Optimizer: a Kinematic Approach on Optimization in Deep Learning

01/22/2021
by   Dariush Bahrami, et al.

We introduce Gravity, a new algorithm for gradient-based optimization. In this paper, we explain how our novel idea changes parameters to reduce the deep learning model's loss. Gravity has three intuitive hyper-parameters, and we propose the best values for them. We also propose an alternative to the moving average. To compare the performance of the Gravity optimizer with two common optimizers, Adam and RMSProp, two VGGNet models were trained on five standard datasets with a batch size of 128 for 100 epochs. The Gravity hyper-parameters did not need to be tuned for different models. As explained further in the paper, no overfitting-prevention technique was used, in order to investigate the direct impact of the optimizer itself on loss reduction. The results show that the Gravity optimizer performs more stably than Adam and RMSProp and achieves higher validation accuracy on datasets with more output classes, such as CIFAR-100 (Fine).
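The abstract does not give the Gravity update rule itself, so the following is only a minimal sketch of the comparison protocol it describes: a VGG-style network trained on CIFAR-100 (Fine) with a batch size of 128 for 100 epochs, without overfitting-prevention techniques, under the baseline optimizers Adam and RMSProp (Keras). The network layout and any other hyper-parameters are assumptions for illustration, not the models used in the paper; the Gravity optimizer would be substituted for the baselines as a custom optimizer.

```python
# Sketch of the experimental protocol from the abstract (assumed details noted below):
# batch size 128, 100 epochs, no dropout / weight decay / other overfitting prevention.
# Only the baseline optimizers (Adam, RMSprop) are run here; the Gravity update rule
# is defined in the paper and is not reproduced in this sketch.
import tensorflow as tf
from tensorflow import keras

(x_train, y_train), (x_test, y_test) = keras.datasets.cifar100.load_data(label_mode="fine")
x_train, x_test = x_train / 255.0, x_test / 255.0

def build_vgg_like():
    # Assumed small VGG-style stack; the paper uses two specific VGGNet models.
    return keras.Sequential([
        keras.layers.Conv2D(64, 3, padding="same", activation="relu",
                            input_shape=(32, 32, 3)),
        keras.layers.Conv2D(64, 3, padding="same", activation="relu"),
        keras.layers.MaxPooling2D(),
        keras.layers.Conv2D(128, 3, padding="same", activation="relu"),
        keras.layers.Conv2D(128, 3, padding="same", activation="relu"),
        keras.layers.MaxPooling2D(),
        keras.layers.Flatten(),
        keras.layers.Dense(256, activation="relu"),  # no regularization, per the protocol
        keras.layers.Dense(100, activation="softmax"),
    ])

for name, opt in [("adam", keras.optimizers.Adam()),
                  ("rmsprop", keras.optimizers.RMSprop())]:
    model = build_vgg_like()
    model.compile(optimizer=opt,
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    model.fit(x_train, y_train,
              batch_size=128, epochs=100,
              validation_data=(x_test, y_test), verbose=2)
```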


research · 02/05/2021
Evaluating Deep Learning in SystemML using Layer-wise Adaptive Rate Scaling (LARS) Optimizer
Increasing the batch size of a deep learning model is a challenging task...

research · 03/06/2023
Judging Adam: Studying the Performance of Optimization Methods on ML4SE Tasks
Solving a problem with a deep learning model requires researchers to opt...

research · 06/02/2023
Leveraging the Triple Exponential Moving Average for Fast-Adaptive Moment Estimation
Network optimization is a crucial step in the field of deep learning, as...

research · 09/14/2020
Deforming the Loss Surface to Affect the Behaviour of the Optimizer
In deep learning, it is usually assumed that the optimization process is...

research · 02/15/2022
A Light-Weight Multi-Objective Asynchronous Hyper-Parameter Optimizer
We describe a light-weight yet performant system for hyper-parameter opt...

research · 06/25/2021
Ranger21: a synergistic deep learning optimizer
As optimizers are critical to the performances of neural networks, every...

research · 11/04/2020
EAdam Optimizer: How ε Impact Adam
Many adaptive optimization methods have been proposed and used in deep l...
