The connections between Lyapunov functions for some optimization algorithms and differential equations

09/01/2020
by   J. M. Sanz-Serna, et al.

In this manuscript we study the properties of a family of second-order differential equations with damping, their discretizations, and their connections with accelerated optimization algorithms for m-strongly convex and L-smooth functions. In particular, using the Linear Matrix Inequality framework developed in Fazlyab et al. (2018), we analytically derive a (discrete) Lyapunov function for a two-parameter family of Nesterov optimization methods, which allows for a complete characterization of their convergence rate. We then show that, in the appropriate limit, this family of methods may be seen as a discretization of a family of second-order ordinary differential equations, whose properties can also be understood via a (continuous) Lyapunov function, which in turn can be obtained by studying the limiting behaviour of the discrete Lyapunov function. Finally, we show that most typical discretizations of this ODE, such as the Heavy ball method, do not possess suitable discrete Lyapunov functions and hence fail to reproduce the desired limiting behaviour of the ODE; consequently, their convergence rates, when seen as optimization methods, cannot behave in an "accelerated" manner.
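As a concrete illustration of the accelerated methods discussed in the abstract, the sketch below implements Nesterov's accelerated gradient method with the standard constant momentum coefficient for a strongly convex quadratic. This is a minimal, assumed example for intuition only; it is not the paper's two-parameter family of Nesterov methods, and the test problem (a diagonal quadratic with m = 1, L = 100) is chosen arbitrarily.

```python
import numpy as np

def nesterov(A, x0, steps):
    """Nesterov's accelerated gradient on f(x) = 0.5 * x^T A x.

    Uses the standard constant momentum coefficient
    (sqrt(L) - sqrt(m)) / (sqrt(L) + sqrt(m)) for an
    m-strongly convex, L-smooth objective.
    """
    eigs = np.linalg.eigvalsh(A)
    m, L = eigs.min(), eigs.max()      # strong convexity and smoothness constants
    h = 1.0 / L                        # step size
    beta = (np.sqrt(L) - np.sqrt(m)) / (np.sqrt(L) + np.sqrt(m))
    x, x_prev = x0.copy(), x0.copy()
    for _ in range(steps):
        y = x + beta * (x - x_prev)    # extrapolation (momentum) step
        x_prev = x
        x = y - h * (A @ y)            # gradient step at the extrapolated point
    return x

A = np.diag([1.0, 100.0])              # m = 1, L = 100, condition number 100
x0 = np.array([1.0, 1.0])
x = nesterov(A, x0, 200)
print(np.linalg.norm(x))               # distance to the minimizer x* = 0
```

For this problem the error contracts roughly like (1 - sqrt(m/L))^k per iteration, which is the "accelerated" rate the abstract refers to; plain gradient descent or the Heavy ball method with a poorly chosen discretization would contract more slowly.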


Related research

05/15/2023
On the connections between optimization algorithms, Lyapunov functions, and differential equations: theory and insights
We study connections between differential equations and optimization alg...

10/31/2018
A general system of differential equations to model first order adaptive algorithms
First order optimization algorithms play a major role in large scale mac...

10/28/2020
Optimization Fabrics for Behavioral Design
Second-order differential equations define smooth system behavior. In ge...

06/06/2022
Essential convergence rate of ordinary differential equations appearing in optimization
Some continuous optimization methods can be connected to ordinary differ...

04/20/2023
Understanding Accelerated Gradient Methods: Lyapunov Analyses and Hamiltonian Assisted Interpretations
We formulate two classes of first-order algorithms more general than pre...

08/05/2020
Optimization Fabrics
This paper presents a theory of optimization fabrics, second-order diffe...

12/27/2021
Last-Iterate Convergence of Saddle Point Optimizers via High-Resolution Differential Equations
Several widely-used first-order saddle point optimization methods yield ...
