ADOM: Accelerated Decentralized Optimization Method for Time-Varying Networks

02/18/2021
by Dmitry Kovalev, et al.

We propose ADOM, an accelerated method for smooth and strongly convex decentralized optimization over time-varying networks. ADOM uses a dual oracle, i.e., we assume access to the gradient of the Fenchel conjugate of the individual loss functions. Up to a constant factor that depends only on the network structure, its communication complexity matches that of Nesterov's accelerated gradient method (Nesterov, 2003). To the best of our knowledge, only the algorithm of Rogozin et al. (2019) achieves a convergence rate with similar properties. However, their algorithm converges under the very restrictive assumption that the number of network changes cannot exceed a tiny fraction of the number of iterations. This assumption is hard to satisfy in practice, since changes in the network topology usually cannot be controlled. In contrast, ADOM merely requires the network to stay connected throughout time.
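The dual oracle assumption can be made concrete on a quadratic loss, where the gradient of the Fenchel conjugate has a closed form. The sketch below is illustrative only (it is not from the paper, and `make_dual_oracle` is a hypothetical helper name): for f(x) = ½ xᵀAx − bᵀx with A symmetric positive definite, the conjugate gradient is ∇f*(y) = A⁻¹(y + b), and ∇f* inverts ∇f.

```python
import numpy as np

# A dual oracle returns grad f*(y), where f*(y) = sup_x <y, x> - f(x).
# For the strongly convex quadratic f(x) = 0.5 * x^T A x - b^T x (A symmetric
# positive definite), the supremum is attained where y - A x + b = 0, hence
#   grad f*(y) = A^{-1} (y + b).

def make_dual_oracle(A, b):
    """Return a function y -> grad f*(y) for the quadratic loss above."""
    A_inv = np.linalg.inv(A)
    return lambda y: A_inv @ (y + b)

# Sanity check: grad f* is the inverse map of grad f, so applying the dual
# oracle to grad f(x) should recover x.
rng = np.random.default_rng(0)
M = rng.standard_normal((3, 3))
A = M @ M.T + 3 * np.eye(3)        # symmetric positive definite => strong convexity
b = rng.standard_normal(3)
x = rng.standard_normal(3)

grad_f = A @ x - b                  # grad f(x)
dual_oracle = make_dual_oracle(A, b)
x_recovered = dual_oracle(grad_f)   # should equal x
assert np.allclose(x_recovered, x)
```

In practice each node would evaluate such an oracle for its own local loss; for non-quadratic losses the conjugate gradient amounts to solving a small strongly concave maximization per call.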


Related research:

- 10/05/2018: Accelerated Decentralized Optimization with Local Updates for Smooth and Strongly Convex Objectives. "In this paper, we study the problem of minimizing a sum of smooth and st..."
- 04/06/2021: Accelerated Gradient Tracking over Time-varying Graphs for Decentralized Optimization. "Decentralized optimization over time-varying graphs has been increasingl..."
- 11/10/2015: Asynchronous Decentralized 20 Questions for Adaptive Search. "This paper considers the problem of adaptively searching for an unknown ..."
- 11/04/2020: Asynchrony and Acceleration in Gossip Algorithms. "This paper considers the minimization of a sum of smooth and strongly co..."
- 09/08/2020: Accelerated Multi-Agent Optimization Method over Stochastic Networks. "We propose a distributed method to solve a multi-agent optimization prob..."
- 07/26/2022: DADAO: Decoupled Accelerated Decentralized Asynchronous Optimization for Time-Varying Gossips. "DADAO is a novel decentralized asynchronous stochastic algorithm to mini..."
- 06/08/2021: Lower Bounds and Optimal Algorithms for Smooth and Strongly Convex Decentralized Optimization Over Time-Varying Networks. "We consider the task of minimizing the sum of smooth and strongly convex..."
