Revisiting the acceleration phenomenon via high-resolution differential equations

12/12/2022
by Shuo Chen, et al.

Nesterov's accelerated gradient descent (NAG) is one of the milestones in the history of first-order algorithms. The mechanism behind the acceleration phenomenon was not uncovered until the high-resolution differential equation framework was proposed in [Shi et al., 2022], which attributes it to the gradient correction term. To deepen our understanding of how the high-resolution differential equation framework bears on the convergence rate, in this paper we continue to investigate NAG for μ-strongly convex functions using the techniques of Lyapunov analysis and phase-space representation. First, we revisit the proof based on the gradient-correction scheme. As in [Chen et al., 2022], a straightforward calculation greatly simplifies the proof and, with a minor modification, enlarges the step size to s = 1/L; moreover, the construction of the Lyapunov function becomes principled. We then investigate NAG from the implicit-velocity scheme. Owing to the difference in the velocity iterates, the Lyapunov function constructed from the implicit-velocity scheme needs no additional term, and the calculation of the iterative difference becomes simpler. Together with the optimal step size obtained, the high-resolution differential equation framework built on the implicit-velocity scheme of NAG is perfect and outperforms the gradient-correction scheme.
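To make the object of the analysis concrete, here is a minimal sketch of the NAG iteration for a μ-strongly convex, L-smooth objective with the enlarged step size s = 1/L mentioned above. The momentum coefficient (1 − √(μs))/(1 + √(μs)) is the standard textbook choice for strongly convex NAG and is assumed here for illustration; the sketch does not reproduce the paper's Lyapunov analysis or its phase-space (gradient-correction vs. implicit-velocity) representations.

```python
import numpy as np

def nag_sc(grad, x0, mu, L, num_iters=200):
    """Minimal sketch of NAG for a mu-strongly convex, L-smooth function.

    Assumes the classical constant-momentum iteration with step size s = 1/L
    (the enlarged step size discussed in the abstract); this is a standard
    textbook form, not necessarily the exact scheme analyzed in the paper.
    """
    s = 1.0 / L                                            # step size s = 1/L
    beta = (1 - np.sqrt(mu * s)) / (1 + np.sqrt(mu * s))   # momentum coefficient
    x_prev = x0.copy()
    y = x0.copy()
    for _ in range(num_iters):
        x = y - s * grad(y)             # gradient step at the extrapolated point
        y = x + beta * (x - x_prev)     # momentum (extrapolation) step
        x_prev = x
    return x

# Usage on a simple strongly convex quadratic f(x) = 0.5 * x^T A x
A = np.diag([1.0, 10.0])                # eigenvalues give mu = 1, L = 10
x_star = nag_sc(lambda x: A @ x, x0=np.array([5.0, -3.0]), mu=1.0, L=10.0)
print(x_star)                           # approaches the minimizer at the origin
```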

research · 09/19/2022 · Gradient Norm Minimization of Nesterov Acceleration: o(1/k^3)
In the history of first-order algorithms, Nesterov's accelerated gradien...

research · 11/03/2022 · Proximal Subgradient Norm Minimization of ISTA and FISTA
For first-order smooth optimization, the research on the acceleration ph...

research · 02/11/2019 · Acceleration via Symplectic Discretization of High-Resolution Differential Equations
We study first-order optimization methods obtained by discretizing ordin...

research · 04/28/2023 · On Underdamped Nesterov's Acceleration
The high-resolution differential equation framework has been proven to b...

research · 06/16/2023 · Linear convergence of Nesterov-1983 with the strong convexity
For modern gradient-based optimization, a developmental landmark is Nest...

research · 12/13/2022 · Linear Convergence of ISTA and FISTA
In this paper, we revisit the class of iterative shrinkage-thresholding ...

research · 06/16/2020 · Hessian-Free High-Resolution Nesterov Acceleration for Sampling
We propose an accelerated-gradient-based MCMC method. It relies on a mod...
