Robustness of accelerated first-order algorithms for strongly convex optimization problems

05/27/2019
by Hesameddin Mohammadi, et al.

We study the robustness of accelerated first-order algorithms to stochastic uncertainties in gradient evaluation. Specifically, for unconstrained, smooth, strongly convex optimization problems, we examine the mean-square error in the optimization variable when the iterates are perturbed by additive white noise. This type of uncertainty may arise in situations where an approximation of the gradient is sought through measurements of a real system or in distributed computation over a network. Even though the underlying dynamics of first-order algorithms for this class of problems are nonlinear, we establish upper bounds on the mean-square deviation from the optimal value that are tight up to constant factors. Our analysis quantifies fundamental trade-offs between noise amplification and convergence rates obtained via any acceleration scheme similar to Nesterov's method or the heavy-ball method. To gain additional analytical insight, for strongly convex quadratic problems we explicitly evaluate the steady-state variance of the optimization variable in terms of the eigenvalues of the Hessian of the objective function. We demonstrate that the entire spectrum of the Hessian, rather than just the extreme eigenvalues, influences the robustness of noisy algorithms. We specialize this result to the problem of distributed averaging over undirected networks and examine the role of network size and topology in the robustness of noisy accelerated algorithms.
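To make the setup concrete, the following is a minimal sketch (not code from the paper): it runs the heavy-ball method with additive white noise in the gradient on a toy strongly convex quadratic and empirically estimates the steady-state mean-square error. The Hessian spectrum, the noise level sigma, and the textbook step-size/momentum tuning are all illustrative assumptions, not the paper's parameters.

# Minimal sketch (illustrative, not the paper's code): heavy-ball iterations
# with additive white gradient noise on a toy quadratic f(x) = 0.5 * x^T Q x.
import numpy as np

rng = np.random.default_rng(0)

# Assumed Hessian spectrum; m and L are the strong-convexity and smoothness constants.
eigs = np.array([1.0, 2.0, 5.0, 10.0])
Q = np.diag(eigs)
m, L = eigs.min(), eigs.max()
kappa = L / m

# Textbook heavy-ball tuning for quadratics (an assumption, not the paper's choice).
alpha = 4.0 / (np.sqrt(L) + np.sqrt(m)) ** 2                  # step size
beta = ((np.sqrt(kappa) - 1) / (np.sqrt(kappa) + 1)) ** 2     # momentum

sigma = 0.1            # standard deviation of the additive white gradient noise
n_iters = 200_000
x_prev = x = np.ones(len(eigs))

errs = []
for k in range(n_iters):
    grad = Q @ x + sigma * rng.standard_normal(len(eigs))     # noisy gradient
    x_next = x - alpha * grad + beta * (x - x_prev)            # heavy-ball update
    x_prev, x = x, x_next
    if k > n_iters // 2:                                       # discard the transient
        errs.append(np.sum(x ** 2))                            # ||x - x*||^2 with x* = 0

print("empirical steady-state mean-square error:", np.mean(errs))

Repeating this experiment with a wider Hessian spectrum (larger condition number) illustrates the trade-off discussed in the abstract: tuning for a faster convergence rate tends to amplify the steady-state variance induced by the gradient noise.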

Related research:

09/03/2018 - A Dual Approach for Optimal Algorithms in Distributed Optimization over Networks
We study the optimal convergence rates for distributed convex optimizati...

12/01/2017 - Optimal Algorithms for Distributed Optimization
In this paper, we study the optimal convergence rate for distributed con...

03/14/2021 - Transient growth of accelerated first-order methods for strongly convex optimization problems
Optimization algorithms are increasingly being used in applications with...

09/24/2022 - Tradeoffs between convergence rate and noise amplification for momentum-based accelerated optimization algorithms
We study momentum-based first-order optimization algorithms in which the...

11/02/2022 - Large deviations rates for stochastic gradient descent with strongly convex functions
Recent works have shown that high probability metrics with stochastic gr...

11/05/2020 - Accelerated Additive Schwarz Methods for Convex Optimization with Adaptive Restart
Based on an observation that additive Schwarz methods for general convex...

02/17/2016 - Harder, Better, Faster, Stronger Convergence Rates for Least-Squares Regression
We consider the optimization of a quadratic objective function whose gra...
