The Instability of Accelerated Gradient Descent

02/03/2021
by Amit Attia, et al.

We study the algorithmic stability of Nesterov's accelerated gradient method. For convex quadratic objectives, <cit.> proved that the uniform stability of the method grows quadratically with the number of optimization steps, and conjectured that the same holds in the general convex and smooth case. We disprove this conjecture and show, for two notions of stability, that the stability of Nesterov's accelerated method in fact deteriorates exponentially fast with the number of gradient steps. This stands in sharp contrast not only to the bounds in the quadratic case, but also to known results for non-accelerated gradient methods, where stability typically grows linearly with the number of steps.
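As a reminder of the setup: uniform stability measures, over any two training sets S and S' differing in a single example, the worst-case change in the algorithm's output loss, sup_z |l(A(S), z) - l(A(S'), z)|. Below is a minimal, purely illustrative sketch (not the paper's construction or experiments) of how one might probe this empirically: run Nesterov's accelerated gradient method on two neighboring datasets and track how far the two iterate trajectories drift apart. The loss, step size, and momentum schedule are assumed choices for illustration only.

```python
# Illustrative sketch: trajectory divergence of Nesterov's accelerated
# gradient method on two datasets differing in one example.
# All modeling choices here (loss, eta, momentum schedule) are assumptions.
import numpy as np

def nag(grad, x0, eta, T):
    """Nesterov's accelerated gradient: gradient step at the
    extrapolated point, followed by momentum extrapolation."""
    x, y_prev = x0.copy(), x0.copy()
    traj = [x0.copy()]
    for t in range(1, T + 1):
        y = x - eta * grad(x)        # gradient step at extrapolated point
        beta = (t - 1) / (t + 2)     # a common momentum schedule
        x = y + beta * (y - y_prev)  # momentum extrapolation
        y_prev = y
        traj.append(y.copy())
    return np.array(traj)

rng = np.random.default_rng(0)
n, d = 50, 10
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

# A smooth convex loss: mean_i log(1 + exp(-b_i <a_i, x>)).
def make_grad(A, b):
    def grad(x):
        z = -b * (A @ x)
        s = 1.0 / (1.0 + np.exp(-z))  # sigmoid(z)
        return (A * (-b * s)[:, None]).mean(axis=0)
    return grad

# Neighboring dataset: replace a single example.
A2, b2 = A.copy(), b.copy()
A2[0] = rng.standard_normal(d)
b2[0] = -b[0]

T, eta = 200, 1.0  # deliberately aggressive step size
t1 = nag(make_grad(A, b), np.zeros(d), eta, T)
t2 = nag(make_grad(A2, b2), np.zeros(d), eta, T)
drift = np.linalg.norm(t1 - t2, axis=1)
print(drift[::20])  # divergence between the two trajectories over time
```

On quadratic objectives this drift is bounded polynomially in T; the paper's result is that for general convex smooth objectives it can grow exponentially in T.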
