Accelerating Perturbed Stochastic Iterates in Asynchronous Lock-Free Optimization

09/30/2021
by Kaiwen Zhou, et al.

We show that stochastic acceleration can be achieved under the perturbed iterate framework (Mania et al., 2017) in asynchronous lock-free optimization, which leads to the optimal incremental gradient complexity for finite-sum objectives. We prove that our new accelerated method requires the same linear speedup condition as existing non-accelerated methods. Our core algorithmic discovery is a new accelerated SVRG variant with sparse updates. Empirical results are presented to verify our theoretical findings.
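For readers less familiar with SVRG, the minimal serial sketch below illustrates the basic variance-reduced update that the proposed accelerated, sparse, asynchronous variant builds on; it is not the paper's algorithm. The problem choice (L2-regularized least squares), the function name svrg_epoch, and the dense handling of the snapshot gradient mu are assumptions made for illustration; a genuinely sparse implementation would apply mu lazily, coordinate by coordinate.

```python
import numpy as np

def svrg_epoch(X, y, w, step_size, reg, inner_iters, rng):
    """One inner epoch of plain (serial, non-accelerated) SVRG on
    L2-regularized least squares:
        f_i(w) = 0.5*(x_i^T w - y_i)^2 + 0.5*reg*||w||^2.

    Illustrative sketch only: the snapshot gradient `mu` is kept dense
    here for clarity, whereas sparse implementations apply it lazily
    per coordinate.
    """
    n, _ = X.shape
    w_snap = w.copy()                                  # snapshot point
    mu = X.T @ (X @ w_snap - y) / n + reg * w_snap     # full gradient at snapshot

    for _ in range(inner_iters):
        i = rng.integers(n)
        xi = X[i]
        # Variance-reduced gradient: grad_i(w) - grad_i(w_snap) + mu
        g = (xi @ w - y[i]) * xi - (xi @ w_snap - y[i]) * xi \
            + reg * (w - w_snap) + mu
        w = w - step_size * g
    return w

# Tiny usage example on synthetic data (hypothetical parameters).
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 20))
w_true = rng.standard_normal(20)
y = X @ w_true + 0.01 * rng.standard_normal(200)

w = np.zeros(20)
for epoch in range(10):
    w = svrg_epoch(X, y, w, step_size=0.01, reg=1e-3, inner_iters=200, rng=rng)
```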

Related research

02/27/2018 · Accelerating Asynchronous Algorithms for Convex Optimization by Momentum Compensation
Asynchronous algorithms have attracted much attention recently due to th...

04/15/2023 · Stochastic Distributed Optimization under Average Second-order Similarity: Algorithms and Analysis
We study finite-sum distributed optimization problems with n-clients und...

10/23/2020 · Escape saddle points faster on manifolds via perturbed Riemannian stochastic recursive gradient
In this paper, we propose a variant of Riemannian stochastic recursive g...

06/16/2020 · Federated Accelerated Stochastic Gradient Descent
We propose Federated Accelerated Stochastic Gradient Descent (FedAc), a ...

11/04/2020 · Asynchrony and Acceleration in Gossip Algorithms
This paper considers the minimization of a sum of smooth and strongly co...

02/27/2020 · On the Convergence of Nesterov's Accelerated Gradient Method in Stochastic Settings
We study Nesterov's accelerated gradient method in the stochastic approx...

01/11/2018 · Improved asynchronous parallel optimization analysis for stochastic incremental methods
As datasets continue to increase in size and multi-core computer archite...
