Parametric complexity analysis for a class of first-order Adagrad-like algorithms

03/03/2022
by S. Gratton, et al.

A class of algorithms for optimization in the presence of noise is presented that does not require evaluation of the objective function. This class generalizes the well-known Adagrad method. The complexity of the class is then analyzed as a function of its parameters, and it is shown that some methods in the class enjoy a better asymptotic convergence rate than previously known. A new class of algorithms with similar characteristics is then derived. Initial numerical experiments suggest that it may have some merits in practice.
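To make the setting concrete, the sketch below shows a plain Adagrad iteration, the baseline member of the class considered here: each step uses only gradient information and never a value of the objective function. The step size eta, the safeguard eps, the iteration count, and the quadratic test problem are illustrative assumptions, and the paper's parametric generalization of the gradient scaling is not reproduced.

```python
# Minimal sketch of a plain Adagrad iteration (not the paper's exact
# parametric family): only gradient evaluations are used, never f(x).
import numpy as np

def adagrad(grad, x0, eta=1.0, eps=1e-8, iters=500):
    """Run `iters` Adagrad steps from x0 using the gradient oracle `grad`."""
    x = np.array(x0, dtype=float)
    v = np.zeros_like(x)                    # running sum of squared gradients
    for _ in range(iters):
        g = grad(x)                         # gradient only: no objective value needed
        v += g * g                          # per-coordinate scaling accumulator
        x -= eta * g / (eps + np.sqrt(v))   # scaled first-order step
    return x

# Illustrative use on f(x) = 0.5 * ||x - b||^2, whose gradient is x - b.
b = np.array([1.0, -2.0])
x_min = adagrad(lambda x: x - b, x0=[0.0, 0.0])
```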

Related research

Multilevel Objective-Function-Free Optimization with an Application to Neural Networks Training (02/14/2023)
A class of multi-level algorithms for unconstrained nonlinear optimizati...

The Bregman chord divergence (10/22/2018)
Distances are fundamental primitives whose choice significantly impacts ...

Proximal Gradient Method for Manifold Optimization (11/02/2018)
This paper considers manifold optimization problems with nonsmooth and n...

Improving the Convergence Rate of One-Point Zeroth-Order Optimization using Residual Feedback (06/18/2020)
Many existing zeroth-order optimization (ZO) algorithms adopt two-point ...

Restarts subject to approximate sharpness: A parameter-free and optimal scheme for first-order methods (01/05/2023)
Sharpness is an almost generic assumption in continuous optimization tha...

The Impact of Noise on Evaluation Complexity: The Deterministic Trust-Region Case (04/06/2021)
Intrinsic noise in objective function and derivatives evaluations may ca...

On strong homogeneity of a class of global optimization algorithms working with infinite and infinitesimal scales (01/13/2018)
The necessity to find the global optimum of multiextremal functions aris...