On the convergence result of the gradient-push algorithm on directed graphs with constant stepsize

02/17/2023
by   Woocheol Choi, et al.

The gradient-push algorithm has been widely used for decentralized optimization problems in which the connectivity network is a directed graph. This paper shows that the gradient-push algorithm with constant stepsize α > 0 converges exponentially fast to an O(α)-neighborhood of the optimizer, under the assumptions that each local cost is smooth and the total cost is strongly convex. Numerical experiments are provided to support the theoretical convergence results.
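In the standard gradient-push iteration, each agent mixes its variable and an auxiliary push-sum weight through a column-stochastic matrix built from out-degrees, de-biases the mixed variable by dividing by the weight, and then takes a gradient step of its local cost with the constant stepsize α. The sketch below illustrates one way such an iteration can look in Python; the directed ring topology, the quadratic local costs f_i(x) = 0.5 * ||B_i x - b_i||^2, the stepsize value, and the iteration count are illustrative assumptions and are not taken from the paper.

```python
# A minimal sketch of a gradient-push iteration with constant stepsize.
# The graph, local costs, and parameter values are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

# Directed communication graph: edges[j] = out-neighbors of node j (self-loops included).
edges = {0: [0, 1], 1: [1, 2], 2: [2, 3], 3: [3, 0]}   # assumed directed ring
n, d = len(edges), 2                                    # number of agents, problem dimension

# Column-stochastic mixing matrix: node j splits its value evenly among its out-neighbors.
A = np.zeros((n, n))
for j, outs in edges.items():
    for i in outs:
        A[i, j] = 1.0 / len(outs)

# Smooth local costs f_i(x) = 0.5 * ||B_i x - b_i||^2 whose sum is strongly convex.
B = rng.standard_normal((n, d, d)) + 2 * np.eye(d)
b = rng.standard_normal((n, d))

def grad(i, x):
    """Gradient of the i-th local cost at x."""
    return B[i].T @ (B[i] @ x - b[i])

alpha = 0.02                      # constant stepsize alpha > 0 (assumed value)
x = np.zeros((n, d))              # x_i(0)
y = np.ones(n)                    # push-sum weights y_i(0) = 1
z = x.copy()                      # de-biased estimates z_i = x_i / y_i

for k in range(500):
    w = A @ x                     # w_i(k+1) = sum_j A_ij x_j(k)
    y = A @ y                     # y_i(k+1) = sum_j A_ij y_j(k)
    z = w / y[:, None]            # z_i(k+1) = w_i(k+1) / y_i(k+1)
    g = np.array([grad(i, z[i]) for i in range(n)])
    x = w - alpha * g             # gradient step with constant stepsize

# Distance of the agents' estimates to the minimizer of the total cost sum_i f_i.
x_star = np.linalg.solve(sum(B[i].T @ B[i] for i in range(n)),
                         sum(B[i].T @ b[i] for i in range(n)))
print("max distance to optimizer:", np.max(np.linalg.norm(z - x_star, axis=1)))
```

With a constant stepsize the estimates z_i do not converge exactly; consistent with the result stated in the abstract, they settle exponentially fast into a neighborhood of the optimizer whose radius scales with α.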

