Parametric complexity analysis for a class of first-order Adagrad-like algorithms

03/03/2022
by S. Gratton et al.

A class of algorithms for optimization in the presence of noise is presented that does not require evaluation of the objective function. This class generalizes the well-known Adagrad method. The complexity of the class is then analyzed as a function of its parameters, and it is shown that some methods in the class enjoy a better asymptotic convergence rate than previously known. A related class of algorithms with similar characteristics is then derived. Initial numerical experiments suggest that it may have some merit in practice.
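For context, the sketch below shows the standard Adagrad update that the paper's class generalizes: each coordinate's step is scaled by the square root of the accumulated squared gradients, and only gradient information is used, never the objective value. This is a generic Python illustration, not the algorithm class from the paper; the function names and the noisy quadratic test problem are invented for this example, and the paper's parametrization of the class is only given in the full text.

```python
import numpy as np

def adagrad(grad, x0, alpha=0.5, eps=1e-8, iters=500):
    """Vanilla Adagrad: scale each coordinate's step by the root of
    the accumulated squared (possibly noisy) gradients."""
    x = np.asarray(x0, dtype=float)
    g_sq_sum = np.zeros_like(x)   # running sum of squared gradients
    for _ in range(iters):
        g = grad(x)               # only the gradient is queried; the
                                  # objective value is never evaluated
        g_sq_sum += g ** 2
        x -= alpha * g / (eps + np.sqrt(g_sq_sum))
    return x

# Usage: minimize a simple quadratic from noisy gradient evaluations.
rng = np.random.default_rng(0)
noisy_grad = lambda x: 2.0 * x + 0.01 * rng.standard_normal(x.shape)
print(adagrad(noisy_grad, np.ones(5)))  # converges toward the origin
```

One natural way to parametrize such a class, consistent with the abstract's description, is to replace the fixed square-root scaling with a tunable exponent; that is only a plausible reading here, and the precise class studied in the paper is defined in the full text.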
