On the convergence result of the gradient-push algorithm on directed graphs with constant stepsize

by   Woocheol Choi, et al.
HanYang University

The gradient-push algorithm has been widely used for decentralized optimization problems in which the connectivity network is a directed graph. This paper shows that the gradient-push algorithm with constant stepsize α>0 converges exponentially fast to an O(α)-neighborhood of the optimizer, under the assumption that each local cost is smooth and the total cost is strongly convex. Numerical experiments are provided to support the theoretical convergence results.
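The iteration analyzed in the abstract can be illustrated with a minimal sketch, assuming the standard push-sum form of gradient-push (each node mixes its state and a scalar weight through a column-stochastic matrix, de-biases by the weight, and takes a gradient step with constant stepsize α). The 3-node directed graph and the quadratic costs f_i(x) = ½(x − b_i)² below are illustrative choices, not taken from the paper.

```python
import numpy as np

def gradient_push(A, grads, x0, alpha, iters):
    """A minimal gradient-push sketch.

    A:     column-stochastic mixing matrix of the directed graph
    grads: per-node gradient functions of the local costs f_i
    x0:    initial node states; alpha: constant stepsize
    """
    x = np.array(x0, dtype=float)
    y = np.ones(len(x))                 # push-sum de-biasing weights
    z = x / y
    for _ in range(iters):
        w = A @ x                       # push states along out-edges
        y = A @ y                       # push weights along out-edges
        z = w / y                       # de-biased estimates of the average
        # constant-stepsize gradient step on each node's local cost
        x = w - alpha * np.array([g(zi) for g, zi in zip(grads, z)])
    return z                            # per-node estimates of the optimizer

# Illustrative directed ring on 3 nodes with self-loops; columns sum to 1.
A = np.array([[0.5, 0.0, 0.5],
              [0.5, 0.5, 0.0],
              [0.0, 0.5, 0.5]])
b = [1.0, 2.0, 6.0]
grads = [lambda x, bi=bi: x - bi for bi in b]   # gradient of 0.5*(x - bi)^2

z = gradient_push(A, grads, [0.0, 0.0, 0.0], alpha=0.01, iters=2000)
# The total cost is strongly convex with minimizer mean(b) = 3.0;
# each z_i should land in an O(alpha)-neighborhood of it.
```

Consistent with the paper's setting, shrinking `alpha` tightens the neighborhood around the optimizer at the price of slower (though still geometric) convergence.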


