The Instability of Accelerated Gradient Descent

02/03/2021
by Amit Attia, et al.

We study the algorithmic stability of Nesterov's accelerated gradient method. For convex quadratic objectives, <cit.> proved that the uniform stability of the method grows quadratically with the number of optimization steps, and conjectured that the same holds in the general convex and smooth case. We disprove this conjecture and show, for two notions of stability, that the stability of Nesterov's accelerated method in fact deteriorates exponentially fast with the number of gradient steps. This stands in sharp contrast both to the bounds in the quadratic case and to known results for non-accelerated gradient methods, where stability typically grows linearly with the number of steps.
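For reference, below is a minimal sketch of one common formulation of Nesterov's accelerated gradient method (the FISTA-style momentum schedule with step size 1/L for an L-smooth convex objective). The function name `nesterov_agd`, the quadratic example, and the particular momentum schedule are illustrative assumptions; the exact variant and parameters analyzed in the paper may differ.

```python
import numpy as np

def nesterov_agd(grad, x0, L, num_steps):
    """Illustrative Nesterov accelerated gradient method for an L-smooth
    convex objective (FISTA-style momentum; not necessarily the exact
    variant studied in the paper)."""
    x_prev = np.asarray(x0, dtype=float)
    y = x_prev.copy()
    lam_prev = 1.0  # momentum parameter t_1 = 1
    for _ in range(num_steps):
        # Gradient step taken from the extrapolated point y.
        x = y - (1.0 / L) * grad(y)
        # Update the momentum parameter: t_{k+1} = (1 + sqrt(1 + 4 t_k^2)) / 2.
        lam = (1.0 + np.sqrt(1.0 + 4.0 * lam_prev ** 2)) / 2.0
        gamma = (lam_prev - 1.0) / lam
        # Extrapolate using the previous two iterates (the "acceleration" step).
        y = x + gamma * (x - x_prev)
        x_prev, lam_prev = x, lam
    return x_prev

# Example usage on a smooth convex quadratic f(x) = 0.5 * x^T A x - b^T x.
A = np.array([[3.0, 0.5], [0.5, 1.0]])
b = np.array([1.0, -2.0])
L_smooth = np.linalg.eigvalsh(A).max()  # smoothness constant of f
x_hat = nesterov_agd(lambda x: A @ x - b, np.zeros(2), L_smooth, 200)
```

Algorithmic stability here refers to how much the output iterate changes when the training objective is perturbed (e.g., one sample in an empirical risk is replaced); the paper's result is that for Nesterov's method this sensitivity can grow exponentially in the number of gradient steps, rather than quadratically as in the quadratic case.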
