Distributed Nesterov gradient methods over arbitrary graphs

01/21/2019
by Ran Xin, et al.

In this letter, we introduce a distributed Nesterov method, termed ABN, that does not require doubly stochastic weight matrices. Instead, the implementation is based on a simultaneous application of both row- and column-stochastic weights, which makes the method applicable to arbitrary (strongly connected) graphs. Since constructing column-stochastic weights requires additional information (the number of outgoing neighbors at each agent) that is not available in certain communication protocols, we derive a variation, termed FROZEN, that requires only row-stochastic weights, at the expense of additional iterations for eigenvector learning. We numerically study these algorithms for various objective functions and network parameters, and show that the proposed distributed Nesterov methods achieve acceleration compared to the current state-of-the-art methods for distributed optimization.
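
To make the weighting structure concrete, the following toy sketch simulates an AB-style gradient-tracking update (row-stochastic A for the estimates, column-stochastic B for the gradient tracker) combined with a Nesterov-type extrapolation step on a small directed graph. The graph, weight matrices, step size alpha, momentum beta, and the placement of the momentum term are illustrative assumptions, not the exact ABN recursion from the letter.

```python
# Hedged sketch (not the paper's exact algorithm): AB-style gradient tracking
# with row-stochastic A and column-stochastic B, plus Nesterov-type extrapolation,
# on a toy 3-agent directed network with local quadratics f_i(x) = 0.5*(x - b_i)^2.
import numpy as np

n = 3
b = np.array([1.0, 2.0, 4.0])            # local data; minimizer of sum_i f_i is mean(b)

def local_grads(x):                       # stacked local gradients: grad f_i evaluated at x_i
    return x - b

# Directed edges 1->2, 2->3, 3->1, 1->3 plus self-loops (strongly connected).
# A is row-stochastic: each agent averages over its in-neighbors.
# B is column-stochastic: each agent splits its mass over its out-neighbors.
A = np.array([[1/2, 0.0, 1/2],
              [1/2, 1/2, 0.0],
              [1/3, 1/3, 1/3]])
B = np.array([[1/3, 0.0, 1/2],
              [1/3, 1/2, 0.0],
              [1/3, 1/2, 1/2]])

alpha, beta = 0.05, 0.3                   # step size and momentum (assumed values)
x = np.zeros(n)                           # one scalar estimate per agent
v = x.copy()                              # extrapolated (momentum) iterate
y = local_grads(v)                        # gradient tracker, initialized at local gradients
g_prev = y.copy()

for k in range(1000):
    x_new = A @ v - alpha * y             # row-stochastic mixing + descent along the tracker
    v = x_new + beta * (x_new - x)        # Nesterov-style extrapolation
    x = x_new
    g = local_grads(v)
    y = B @ y + g - g_prev                # column-stochastic mixing of the gradient tracker
    g_prev = g

print(x)                                  # each entry approaches mean(b) = 7/3
```

Per the abstract, FROZEN would avoid the column-stochastic matrix B (and hence the need to know out-degrees) by using only row-stochastic weights together with a separate eigenvector-learning recursion; that variant is not shown in this sketch.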

Related research

- Distributed stochastic optimization with gradient tracking over strongly-connected networks (03/18/2019)
- Variance reduced stochastic optimization over directed graphs with row and column stochastic weights (02/07/2022)
- Distributed Optimization over Directed Graphs with Row Stochasticity and Constraint Regularity (06/19/2018)
- On constructing orthogonal generalized doubly stochastic matrices (09/20/2018)
- Stochastic Rank-1 Bandits (08/10/2016)
- Block stochastic gradient descent for large-scale tomographic reconstruction in a parallel network (03/28/2019)
- Push–Pull with Device Sampling (06/08/2022)
