DADAO: Decoupled Accelerated Decentralized Asynchronous Optimization for Time-Varying Gossips

07/26/2022
by Adel Nabli, et al.

DADAO is a novel decentralized asynchronous stochastic algorithm for minimizing a sum of L-smooth and μ-strongly convex functions distributed over a time-varying connectivity network of size n. We model the local gradient updates and the gossip communication procedure with separate, independent Poisson Point Processes, decoupling the computation and communication steps and making the whole approach completely asynchronous. Our method employs primal gradients and uses neither a multi-consensus inner loop nor other ad-hoc mechanisms such as Error Feedback, Gradient Tracking, or a proximal operator. By relating the spatial quantities χ_1^*, χ_2^* of our graphs to a necessary minimal communication rate between the nodes of the network, we show that our algorithm requires 𝒪(n√(L/μ)log(1/ϵ)) local gradients and only 𝒪(n√(χ_1^*χ_2^*)√(L/μ)log(1/ϵ)) communications to reach a precision ϵ. If SGD with uniform noise σ^2 is used, we reach precision ϵ at the same rate, up to a bias term in 𝒪(σ^2/√(μL)). This improves upon the bounds obtained with current state-of-the-art approaches, and our simulations validate the strength of our relatively unconstrained method. Our source code is released in a public repository.
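The decoupling described above can be illustrated with a small simulation. The sketch below, which is an assumption-laden illustration rather than the authors' implementation, samples event times from two independent Poisson Point Processes: one stream triggers local gradient steps and the other triggers gossip communications, so neither type of step ever waits on the other. The function name `sample_event_schedule` and the rate parameters `rate_grad` and `rate_gossip` are hypothetical names chosen for this example.

```python
import random

def sample_event_schedule(rate_grad, rate_gossip, horizon, seed=0):
    """Sample a merged schedule from two independent Poisson Point Processes.

    Events of kind "grad" (a local gradient update) and "gossip" (a pairwise
    communication) arrive independently over [0, horizon], which is what
    decouples computation from communication: the two event streams share
    no synchronization point.
    """
    rng = random.Random(seed)

    def poisson_times(rate):
        # A homogeneous Poisson process has i.i.d. exponential interarrivals.
        t, times = 0.0, []
        while True:
            t += rng.expovariate(rate)
            if t > horizon:
                return times
            times.append(t)

    events = [(t, "grad") for t in poisson_times(rate_grad)]
    events += [(t, "gossip") for t in poisson_times(rate_gossip)]
    events.sort()  # interleave the two independent streams by time
    return events

# Gossip events arrive on average twice as often as gradient events here.
schedule = sample_event_schedule(rate_grad=1.0, rate_gossip=2.0, horizon=100.0)
```

In the actual algorithm, each event would dequeue in time order and trigger either a primal gradient step at a node or a gossip exchange along a currently available edge of the time-varying graph; the simulation above only shows how the two clocks are generated independently.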


Related research

01/28/2019
Asynchronous Accelerated Proximal Stochastic Gradient for Strongly Convex Distributed Finite Sums
In this work, we study the problem of minimizing the sum of strongly con...

10/05/2018
Accelerated Decentralized Optimization with Local Updates for Smooth and Strongly Convex Objectives
In this paper, we study the problem of minimizing a sum of smooth and st...

04/06/2021
Accelerated Gradient Tracking over Time-varying Graphs for Decentralized Optimization
Decentralized optimization over time-varying graphs has been increasingl...

06/08/2021
Lower Bounds and Optimal Algorithms for Smooth and Strongly Convex Decentralized Optimization Over Time-Varying Networks
We consider the task of minimizing the sum of smooth and strongly convex...

02/18/2021
ADOM: Accelerated Decentralized Optimization Method for Time-Varying Networks
We propose ADOM - an accelerated method for smooth and strongly convex d...

05/27/2019
An Accelerated Decentralized Stochastic Proximal Algorithm for Finite Sums
Modern large-scale finite-sum optimization relies on two key aspects: di...
