Local Convergence of Adaptive Gradient Descent Optimizers

02/19/2021
by Sebastian Bock et al.

Adaptive Moment Estimation (ADAM) is a very popular training algorithm for deep neural networks and belongs to the family of adaptive gradient descent optimizers. However, to the best of the authors' knowledge, no complete convergence analysis exists for ADAM. The contribution of this paper is a method for local convergence analysis in batch mode for a deterministic, fixed training set, which yields necessary conditions on the hyperparameters of the ADAM algorithm. Due to the local nature of the arguments, the objective function may be non-convex but must be at least twice continuously differentiable. We then apply this procedure to other adaptive gradient descent algorithms and show local convergence with hyperparameter bounds for most of them.
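For orientation, the standard ADAM update rule analyzed in such work can be sketched as follows. This is a minimal scalar implementation of the well-known update from the original ADAM formulation, not the paper's analysis itself; the hyperparameter values and the toy objective f(θ) = θ² are illustrative choices.

```python
import math

def adam_step(theta, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One deterministic (batch-mode) ADAM update for a scalar parameter.

    theta : current parameter value
    grad  : full-batch gradient at theta
    m, v  : running first- and second-moment estimates
    t     : step counter (1-based), used for bias correction
    """
    m = beta1 * m + (1 - beta1) * grad        # exponential moving average of the gradient
    v = beta2 * v + (1 - beta2) * grad ** 2   # exponential moving average of the squared gradient
    m_hat = m / (1 - beta1 ** t)              # bias-corrected first moment
    v_hat = v / (1 - beta2 ** t)              # bias-corrected second moment
    theta = theta - lr * m_hat / (math.sqrt(v_hat) + eps)
    return theta, m, v

# Toy example: minimize f(theta) = theta^2, whose gradient is 2*theta.
# The training set is fixed, so the gradient is deterministic (batch mode).
theta, m, v = 1.0, 0.0, 0.0
for t in range(1, 5001):
    theta, m, v = adam_step(theta, 2.0 * theta, m, v, t, lr=0.01)
```

Note that even on this convex toy problem the iterates need not shrink monotonically: near the minimizer the effective step length lr * m_hat / sqrt(v_hat) stays on the order of lr, which is exactly the kind of behavior that makes hyperparameter conditions necessary for local convergence.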


Related research

- Exploring the Optimized Value of Each Hyperparameter in Various Gradient Descent Algorithms (12/23/2022): In the recent years, various gradient descent algorithms including the m...
- Non-Convergence and Limit Cycles in the Adam optimizer (10/05/2022): One of the most popular training algorithms for deep neural networks is ...
- Adaptive gradient descent without descent (10/21/2019): We present a strikingly simple proof that two rules are sufficient to au...
- The Complexity of Gradient Descent: CLS = PPAD ∩ PLS (11/03/2020): We study search problems that can be solved by performing Gradient Desce...
- Natural Analysts in Adaptive Data Analysis (01/30/2019): Adaptive data analysis is frequently criticized for its pessimistic gene...
- AdaX: Adaptive Gradient Descent with Exponential Long Term Memory (04/21/2020): Although adaptive optimization algorithms such as Adam show fast converg...
- A Caputo fractional derivative-based algorithm for optimization (04/06/2021): We propose a novel Caputo fractional derivative-based optimization algor...
