Accelerated Decentralized Optimization with Local Updates for Smooth and Strongly Convex Objectives

10/05/2018
by Hadrien Hendrikx, et al.

In this paper, we study the problem of minimizing a sum of smooth and strongly convex functions split over the nodes of a network in a decentralized fashion. We propose the algorithm ESDACD, a decentralized accelerated algorithm that only requires local synchrony. Its rate depends on the condition number κ of the local functions as well as the network topology and delays. Under mild assumptions on the topology of the graph, ESDACD takes a time O((τ_max + Δ_max)√(κ/γ) log(ϵ^-1)) to reach a precision ϵ, where γ is the spectral gap of the graph, τ_max the maximum communication delay and Δ_max the maximum computation time. Therefore, it matches the rate of SSDA, which is optimal when τ_max = Ω(Δ_max). Applying ESDACD to quadratic local functions leads to an accelerated randomized gossip algorithm of rate O(√(θ_gossip/n)), where θ_gossip is the rate of the standard randomized gossip algorithm. To the best of our knowledge, it is the first asynchronous gossip algorithm with a provably improved rate of convergence of the second moment of the error. We illustrate these results with experiments in idealized settings.
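For context on the baseline whose rate θ_gossip appears above, the following is a minimal sketch of standard (non-accelerated) pairwise randomized gossip averaging, where a random edge activates and its two endpoints average their values. The graph, initial values, and function names are illustrative assumptions, not the paper's implementation; the acceleration and local-update machinery of ESDACD is not shown.

```python
import random

def randomized_gossip(values, edges, num_steps, seed=0):
    """Simulate standard pairwise randomized gossip on a graph.

    values : list of initial local values, one per node
    edges  : list of (i, j) node pairs that may communicate
    Each step, a uniformly random edge activates and both endpoints
    replace their values with the pairwise average. All values
    converge to the global mean of the initial values.
    """
    rng = random.Random(seed)
    x = list(values)
    for _ in range(num_steps):
        i, j = rng.choice(edges)
        avg = 0.5 * (x[i] + x[j])
        x[i] = x[j] = avg
    return x

if __name__ == "__main__":
    # Ring of 8 nodes; the global mean of 0..7 is 3.5
    n = 8
    edges = [(i, (i + 1) % n) for i in range(n)]
    x_final = randomized_gossip(list(range(n)), edges, num_steps=500)
    print(x_final)  # each entry should be close to 3.5
```

In this baseline, the error contracts at a rate θ_gossip governed by the spectral gap of the gossip matrix; the paper's claim is that applying ESDACD to quadratic local functions improves this to O(√(θ_gossip/n)).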
