Transient growth of accelerated first-order methods for strongly convex optimization problems

03/14/2021
by Hesameddin Mohammadi, et al.

Optimization algorithms are increasingly being used in applications with limited time budgets. In many real-time and embedded scenarios, only a few iterations can be performed and traditional convergence metrics cannot be used to evaluate performance in these non-asymptotic regimes. In this paper, we examine the transient behavior of accelerated first-order optimization algorithms. For quadratic optimization problems, we employ tools from linear systems theory to show that transient growth arises from the presence of non-normal dynamics. We identify the existence of modes that yield an algebraic growth in early iterations and quantify the transient excursion from the optimal solution caused by these modes. For strongly convex smooth optimization problems, we utilize the theory of integral quadratic constraints to establish an upper bound on the magnitude of the transient response of Nesterov's accelerated method. We show that both the Euclidean distance between the optimization variable and the global minimizer and the rise time to the transient peak are proportional to the square root of the condition number of the problem. Finally, for problems with large condition numbers, we demonstrate that the bounds we derive are tight up to constant factors.
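The transient mechanism described in the abstract is easy to reproduce numerically. The following minimal sketch (Python with NumPy; the condition number, problem dimension, and initial condition are illustrative choices, not taken from the paper) runs Nesterov's accelerated method with the standard stepsize and momentum parameters on a diagonal quadratic and records the distance to the minimizer:

import numpy as np

# Sketch: Nesterov's accelerated method on f(x) = 0.5 * x^T A x with
# A = diag(lams) and minimizer x* = 0, so ||x_k|| is the error directly.
kappa = 1e4                     # condition number L/m (illustrative value)
L, m = 1.0, 1.0 / kappa
lams = np.array([m, L])         # extreme eigenvalues of the Hessian
alpha = 1.0 / L                 # standard stepsize
beta = (np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)  # standard momentum

# Initial condition chosen to excite the slowly decaying, nearly defective
# mode; with the common choice x_{-1} = x_0 the transient is much milder.
x = np.ones_like(lams)
x_prev = -x
errs = []
for _ in range(2000):
    y = x + beta * (x - x_prev)            # extrapolation (momentum) step
    x_prev, x = x, y - alpha * lams * y    # gradient step: grad f(y) = A y
    errs.append(np.linalg.norm(x))

errs = np.array(errs)
print(f"peak error {errs.max():.1f} at iteration {errs.argmax() + 1}; "
      f"sqrt(kappa) = {np.sqrt(kappa):.0f}")

With these choices the error grows to roughly 0.7*sqrt(kappa), about 73, before the linear convergence takes over, and the peak is reached after about sqrt(kappa) = 100 iterations, consistent with the square-root scaling of both the peak magnitude and the rise time stated in the abstract.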


Related research

03/09/2023 · Gauges and Accelerated Optimization over Smooth and/or Strongly Convex Sets
We consider feasibility and constrained optimization problems defined ov...

05/27/2019 · Robustness of accelerated first-order algorithms for strongly convex optimization problems
We study the robustness of accelerated first-order algorithms to stochas...

09/24/2022 · Tradeoffs between convergence rate and noise amplification for momentum-based accelerated optimization algorithms
We study momentum-based first-order optimization algorithms in which the...

12/07/2022 · Generalized Gradient Flows with Provable Fixed-Time Convergence and Fast Evasion of Non-Degenerate Saddle Points
Gradient-based first-order convex optimization algorithms find widesprea...

12/09/2015 · RSG: Beating Subgradient Method without Smoothness and Strong Convexity
In this paper, we study the efficiency of a Restarted SubGradient (RS...

11/15/2022 · Adjoint Variable Method for Transient Nonlinear Electroquasistatic Problems
Many optimization problems in electrical engineering consider a large nu...

06/17/2023 · Distributed Accelerated Projection-Based Consensus Decomposition
With the development of machine learning and Big Data, the concepts of l...
