Accelerated Decentralized Optimization with Local Updates for Smooth and Strongly Convex Objectives

10/05/2018
by   Hadrien Hendrikx, et al.

In this paper, we study the problem of minimizing a sum of smooth and strongly convex functions split over the nodes of a network in a decentralized fashion. We propose ESDACD, a decentralized accelerated algorithm that only requires local synchrony. Its rate depends on the condition number κ of the local functions as well as the network topology and delays. Under mild assumptions on the topology of the graph, ESDACD takes a time O((τ_max + Δ_max)√(κ/γ) ln(ϵ^{-1})) to reach a precision ϵ, where γ is the spectral gap of the graph, τ_max the maximum communication delay, and Δ_max the maximum computation time. Therefore, it matches the rate of SSDA, which is optimal when τ_max = Ω(Δ_max). Applying ESDACD to quadratic local functions leads to an accelerated randomized gossip algorithm of rate O(√(θ_gossip/n)), where θ_gossip is the rate of standard randomized gossip. To the best of our knowledge, it is the first asynchronous gossip algorithm with a provably improved rate of convergence of the second moment of the error. We illustrate these results with experiments in idealized settings.
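To make the gossip baseline concrete: the θ_gossip rate above refers to standard randomized pairwise gossip, in which a random edge is activated at each step and its two endpoints average their values. The sketch below simulates that baseline (not the accelerated ESDACD variant) on a ring graph; the graph, step count, and seed are illustrative choices, not taken from the paper.

```python
import random

def randomized_gossip(values, edges, num_rounds, seed=0):
    """Standard randomized pairwise gossip: at each round, one edge
    (i, j) is activated uniformly at random and both endpoints replace
    their values by the pairwise average. The global average is
    preserved at every step, and all values converge to it."""
    rng = random.Random(seed)
    x = list(values)
    for _ in range(num_rounds):
        i, j = rng.choice(edges)
        avg = 0.5 * (x[i] + x[j])
        x[i] = avg
        x[j] = avg
    return x

# Ring graph on n nodes; each node starts with a distinct value.
n = 8
edges = [(i, (i + 1) % n) for i in range(n)]
x0 = [float(i) for i in range(n)]
xT = randomized_gossip(x0, edges, num_rounds=2000)
mean = sum(x0) / n  # pairwise averaging preserves the global average
print(max(abs(v - mean) for v in xT))  # disagreement after 2000 rounds
```

The ring is a worst case for this scheme: its spectral gap γ shrinks as O(1/n²), which is exactly the dependence the accelerated variant improves.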


