On the Convergence of Nesterov's Accelerated Gradient Method in Stochastic Settings

02/27/2020
by   Mahmoud Assran, et al.

We study Nesterov's accelerated gradient method in the stochastic approximation setting (unbiased gradients with bounded variance) and the finite-sum setting (where randomness is due to sampling mini-batches). To build insight into the behavior of Nesterov's method in stochastic settings, we focus throughout on objectives that are smooth, strongly convex, and twice continuously differentiable. In the stochastic approximation setting, Nesterov's method converges to a neighborhood of the optimal point at the same accelerated rate as in the deterministic setting. Perhaps surprisingly, in the finite-sum setting, we prove that Nesterov's method may diverge with the usual choice of step-size and momentum unless additional conditions on the problem, related to conditioning and data coherence, are satisfied. Our results shed light on why Nesterov's method may fail to converge or to achieve acceleration in the finite-sum setting.
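For concreteness, the sketch below illustrates what "the usual choice of step-size and momentum" means in code: Nesterov's method with step-size alpha = 1/L and momentum beta = (sqrt(kappa) - 1)/(sqrt(kappa) + 1), where kappa = L/mu, run with mini-batch gradients on a toy least-squares finite sum. This is a minimal NumPy sketch of the standard strongly-convex tuning, not code from the paper; the toy problem, function names, and batch size are illustrative assumptions.

```python
import numpy as np

def nesterov_sgd(grad, x0, L, mu, n_iters):
    """Nesterov's accelerated gradient with (possibly stochastic) gradients.

    grad(x) may return a mini-batch gradient; with the full gradient this
    reduces to the deterministic method. Tuning follows the usual
    strongly-convex choice: alpha = 1/L, beta = (sqrt(kappa)-1)/(sqrt(kappa)+1).
    """
    kappa = L / mu
    alpha = 1.0 / L
    beta = (np.sqrt(kappa) - 1.0) / (np.sqrt(kappa) + 1.0)
    x_prev = x = np.asarray(x0, dtype=float)
    for _ in range(n_iters):
        y = x + beta * (x - x_prev)          # extrapolation (momentum) step
        x_prev, x = x, y - alpha * grad(y)   # gradient step at the look-ahead point
    return x

# Toy finite sum: f(x) = (1/n) * sum_i 0.5 * (a_i^T x - b_i)^2,
# with mini-batch sampling as the only source of randomness.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 20))
b = rng.standard_normal(200)

def minibatch_grad(x, batch=10):
    idx = rng.choice(len(b), size=batch, replace=False)
    Ab = A[idx]
    return Ab.T @ (Ab @ x - b[idx]) / batch

H = A.T @ A / len(b)                  # Hessian of the full objective
L_const = np.linalg.eigvalsh(H).max()   # smoothness constant
mu_const = np.linalg.eigvalsh(H).min()  # strong-convexity constant

x_hat = nesterov_sgd(minibatch_grad, np.zeros(20), L_const, mu_const, n_iters=500)
```

With a full-batch gradient (batch = len(b)) this is exactly the deterministic method; the paper's observation is that, with this same tuning, replacing the full gradient by mini-batch gradients can cause divergence unless the stated conditioning and data-coherence conditions hold.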

Related research

10/07/2018
ASVRG: Accelerated Proximal SVRG
This paper proposes an accelerated proximal stochastic variance reduced ...

07/07/2020
An Accelerated DFO Algorithm for Finite-sum Convex Functions
Derivative-free optimization (DFO) has recently gained a lot of momentum...

02/07/2022
Nesterov Accelerated Shuffling Gradient Method for Convex Optimization
In this paper, we propose Nesterov Accelerated Shuffling Gradient (NASG)...

05/29/2019
A unified variance-reduced accelerated gradient method for convex optimization
We propose a novel randomized incremental gradient algorithm, namely, VA...

01/22/2019
Accelerated Linear Convergence of Stochastic Momentum Methods in Wasserstein Distances
Momentum methods such as Polyak's heavy ball (HB) method, Nesterov's acc...

08/30/2020
Momentum-based Accelerated Mirror Descent Stochastic Approximation for Robust Topology Optimization under Stochastic Loads
Robust topology optimization (RTO) improves the robustness of designs wi...

09/30/2021
Accelerating Perturbed Stochastic Iterates in Asynchronous Lock-Free Optimization
We show that stochastic acceleration can be achieved under the perturbed...
