Tradeoffs between convergence rate and noise amplification for momentum-based accelerated optimization algorithms

09/24/2022
by Hesameddin Mohammadi, et al.

We study momentum-based first-order optimization algorithms in which each iteration uses information from the two previous steps and is subject to additive white noise. This class includes the heavy-ball and Nesterov's accelerated methods as special cases. For strongly convex quadratic problems, we use the steady-state variance of the error in the optimization variable to quantify noise amplification, and we exploit a novel geometric viewpoint to establish analytical lower bounds on the product of the settling time and the smallest/largest achievable noise amplification. For all stabilizing parameters, these bounds scale quadratically with the condition number. We also use the geometric insight developed in the paper to introduce two parameterized families of algorithms that strike a balance between noise amplification and settling time while preserving order-wise Pareto optimality. Finally, for a class of continuous-time gradient flow dynamics whose suitable discretization yields the two-step momentum algorithm, we establish analogous lower bounds that also scale quadratically with the condition number.
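As a rough illustration of the quantities in the abstract, the Python sketch below (not from the paper; the condition number, the 50-point eigenvalue grid, and the tuned heavy-ball step sizes are illustrative assumptions) estimates the noise amplification J, i.e. the steady-state variance of the error in the optimization variable, for the two-step momentum iteration x_{k+1} = x_k + beta*(x_k - x_{k-1}) - alpha*grad_f(x_k + gamma*(x_k - x_{k-1})) + w_k on a strongly convex quadratic. On a quadratic the iteration decouples along the Hessian eigenvectors, so J can be computed mode by mode from a 2x2 discrete Lyapunov equation.

import numpy as np
from scipy.linalg import solve_discrete_lyapunov

def mode_matrices(lam, alpha, beta, gamma):
    """State-space matrices of one eigen-mode (Hessian eigenvalue lam)
    of the two-step momentum method; state is z_k = [x_{k-1}, x_k]."""
    A = np.array([[0.0, 1.0],
                  [-beta + alpha * lam * gamma,
                   (1.0 + beta) - alpha * lam * (1.0 + gamma)]])
    B = np.array([[0.0], [1.0]])   # unit-variance additive white noise
    C = np.array([[0.0, 1.0]])     # read out the error in x_k
    return A, B, C

def noise_amplification(eigvals, alpha, beta, gamma):
    """J = sum over modes of the steady-state variance of x_k, obtained
    from the discrete Lyapunov equation P = A P A^T + B B^T."""
    J = 0.0
    for lam in eigvals:
        A, B, C = mode_matrices(lam, alpha, beta, gamma)
        if max(abs(np.linalg.eigvals(A))) >= 1.0:
            raise ValueError(f"parameters not stabilizing for lam={lam}")
        P = solve_discrete_lyapunov(A, B @ B.T)
        J += (C @ P @ C.T).item()
    return J

def spectral_radius(eigvals, alpha, beta, gamma):
    """Worst-case modulus of the iteration-matrix eigenvalues over all
    modes; the settling time scales like 1 / (1 - rho)."""
    return max(max(abs(np.linalg.eigvals(
        mode_matrices(lam, alpha, beta, gamma)[0]))) for lam in eigvals)

# Illustrative quadratic with condition number kappa = L / m = 100.
m, L = 1.0, 100.0
eigs = np.linspace(m, L, 50)

# Heavy-ball (gamma = 0) with the parameters that give the fastest
# linear rate on quadratics, versus plain gradient descent.
alpha_hb = 4.0 / (np.sqrt(L) + np.sqrt(m)) ** 2
beta_hb = ((np.sqrt(L) - np.sqrt(m)) / (np.sqrt(L) + np.sqrt(m))) ** 2
for name, (al, be, ga) in {
        "gradient descent": (1.0 / L, 0.0, 0.0),
        "heavy-ball": (alpha_hb, beta_hb, 0.0)}.items():
    J = noise_amplification(eigs, al, be, ga)
    rho = spectral_radius(eigs, al, be, ga)
    print(f"{name:16s}  rho = {rho:.4f}  J = {J:8.2f}  "
          f"J/(1 - rho) = {J / (1 - rho):10.1f}")

On this example, heavy-ball settles much faster (smaller 1/(1 - rho)) but amplifies the noise more, while the product J/(1 - rho) remains large for both methods, which is the flavor of the paper's quadratic-in-the-condition-number lower bound.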


Related research

01/22/2019
Accelerated Linear Convergence of Stochastic Momentum Methods in Wasserstein Distances
Momentum methods such as Polyak's heavy ball (HB) method, Nesterov's acc...

08/05/2020
Differentially Private Accelerated Optimization Algorithms
We present two classes of differentially private optimization algorithms...

02/28/2020
Optimization with Momentum: Dynamical, Control-Theoretic, and Symplectic Perspectives
We analyze the convergence rate of various momentum-based optimization a...

03/14/2021
Transient growth of accelerated first-order methods for strongly convex optimization problems
Optimization algorithms are increasingly being used in applications with...

05/27/2019
Robustness of accelerated first-order algorithms for strongly convex optimization problems
We study the robustness of accelerated first-order algorithms to stochas...

12/11/2018
On the Curved Geometry of Accelerated Optimization
In this work we propose a differential geometric motivation for Nesterov...

08/08/2022
A high-resolution dynamical view on momentum methods for over-parameterized neural networks
In this paper, we present the convergence analysis of momentum methods i...
