A Variational Perspective on Accelerated Methods in Optimization

03/14/2016
by Andre Wibisono, et al.

Accelerated gradient methods play a central role in optimization, achieving optimal rates in many settings. While many generalizations and extensions of Nesterov's original acceleration method have been proposed, it is not yet clear what the natural scope of the acceleration concept is. In this paper, we study accelerated methods from a continuous-time perspective. We show that there is a Lagrangian functional, which we call the Bregman Lagrangian, that generates a large class of accelerated methods in continuous time, including (but not limited to) accelerated gradient descent, its non-Euclidean extension, and accelerated higher-order gradient methods. We show that the continuous-time limit of all of these methods corresponds to traveling the same curve in spacetime at different speeds. From this perspective, Nesterov's technique and many of its generalizations can be viewed as a systematic way to go from the continuous-time curves generated by the Bregman Lagrangian to a family of discrete-time accelerated algorithms.
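
As a point of reference, here is a sketch of the Bregman Lagrangian in the form given in the full paper; the notation, the ideal-scaling conditions, and the Euler-Lagrange dynamics below are drawn from the full text rather than from the abstract itself, so they should be read as a summary sketch, not a quotation. Here $h$ is a convex distance-generating function, $D_h$ is its Bregman divergence, and $\alpha_t, \beta_t, \gamma_t$ are time-dependent scaling functions:

\[
\mathcal{L}(X, V, t) = e^{\alpha_t + \gamma_t} \Bigl( D_h\bigl(X + e^{-\alpha_t} V,\, X\bigr) - e^{\beta_t} f(X) \Bigr),
\qquad
D_h(y, x) = h(y) - h(x) - \langle \nabla h(x),\, y - x \rangle .
\]

Under the ideal-scaling conditions $\dot\beta_t \le e^{\alpha_t}$ and $\dot\gamma_t = e^{\alpha_t}$, the Euler-Lagrange equation of this functional is the second-order ODE

\[
\ddot X_t + \bigl(e^{\alpha_t} - \dot\alpha_t\bigr)\dot X_t + e^{2\alpha_t + \beta_t}\bigl[\nabla^2 h\bigl(X_t + e^{-\alpha_t}\dot X_t\bigr)\bigr]^{-1}\nabla f(X_t) = 0,
\]

whose solutions satisfy $f(X_t) - f(x^\ast) = O(e^{-\beta_t})$. Taking $h(x) = \tfrac{1}{2}\|x\|^2$ with polynomial scaling functions recovers the continuous-time limit of accelerated gradient descent; other choices of $h$ and of the scaling functions yield the non-Euclidean and higher-order variants mentioned in the abstract.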


Related research

06/14/2017 · Accelerated Extra-Gradient Descent: A Novel Accelerated First-Order Method
We provide a novel accelerated first-order method that achieves the asym...

02/10/2018 · On Symplectic Optimization
Accelerated gradient methods have had significant impact in machine lear...

12/09/2021 · A More Stable Accelerated Gradient Method Inspired by Continuous-Time Perspective
Nesterov's accelerated gradient method (NAG) is widely used in problems ...

05/25/2022 · A systematic approach to Lyapunov analyses of continuous-time models in convex optimization
First-order methods are often analyzed via their continuous-time models,...

09/10/2020 · Analysis of Theoretical and Numerical Properties of Sequential Convex Programming for Continuous-Time Optimal Control
Through the years, Sequential Convex Programming (SCP) has gained great ...

08/13/2018 · Relax, and Accelerate: A Continuous Perspective on ADMM
The acceleration technique first introduced by Nesterov for gradient des...

01/20/2022 · Accelerated Gradient Flow: Risk, Stability, and Implicit Regularization
Acceleration and momentum are the de facto standard in modern applicatio...
