A new regret analysis for Adam-type algorithms

03/21/2020
by Ahmet Alacaoglu, et al.

In this paper, we focus on a theory-practice gap for Adam and its variants (AMSGrad, AdamNC, etc.). In practice, these algorithms are used with a constant first-order moment parameter β_1 (typically between 0.9 and 0.99). In theory, regret guarantees for online convex optimization require a rapidly decaying β_1 → 0 schedule. We show that this is an artifact of the standard analysis, and we propose a novel framework that allows us to derive optimal, data-dependent regret bounds with a constant β_1, without further assumptions. We also demonstrate the flexibility of our analysis on a wide range of different algorithms and settings.
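The regime the abstract refers to can be made concrete with the AMSGrad recursion itself: β_1 multiplies the running average of gradients and, in practice, is held fixed (e.g. 0.9) rather than decayed toward zero. Below is a minimal NumPy sketch of one AMSGrad step with a constant β_1; the function name, default values, and the toy usage loop are illustrative assumptions, not code from the paper, which contributes the regret analysis for this constant-β_1 setting rather than a new update rule.

```python
import numpy as np

def amsgrad_step(x, grad, state, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One AMSGrad update with a constant beta1.

    Illustrative sketch only; names and defaults are assumptions, not taken
    from the paper. Bias correction is omitted, as in the original AMSGrad.
    """
    m, v, v_hat = state
    m = beta1 * m + (1 - beta1) * grad        # first moment: EMA of gradients, constant beta1
    v = beta2 * v + (1 - beta2) * grad ** 2   # second moment: EMA of squared gradients
    v_hat = np.maximum(v_hat, v)              # AMSGrad: running maximum of the second moment
    x = x - lr * m / (np.sqrt(v_hat) + eps)   # adaptive, per-coordinate step
    return x, (m, v, v_hat)

# Toy online loop: gradients g_t arrive one at a time (hypothetical data).
x = np.zeros(5)
state = (np.zeros(5), np.zeros(5), np.zeros(5))
for g_t in np.random.randn(100, 5):
    x, state = amsgrad_step(x, g_t, state)
```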


Related research

04/26/2019 · Adaptive Regret of Convex and Smooth Functions
We investigate online convex optimization in changing environments, and ...

02/13/2020 · Beyond No-Regret: Competitive Control via Online Optimization with Memory
This paper studies online control with adversarial disturbances using to...

02/09/2022 · New Projection-free Algorithms for Online Convex Optimization with Adaptive Regret Guarantees
We present new efficient projection-free algorithms for online convex op...

05/09/2019 · Non-Asymptotic Gap-Dependent Regret Bounds for Tabular MDPs
This paper establishes that optimistic algorithms attain gap-dependent a...

07/01/2022 · Efficient Adaptive Regret Minimization
In online convex optimization the player aims to minimize her regret aga...

12/31/2020 · Optimizing Optimizers: Regret-optimal gradient descent algorithms
The need for fast and robust optimization algorithms are of critical imp...

12/22/2021 · A Unified Analysis Method for Online Optimization in Normed Vector Space
We present a unified analysis method that relies on the generalized cosi...
