S-ADDOPT: Decentralized stochastic first-order optimization over directed graphs

05/15/2020
by Muhammad I. Qureshi, et al.

In this report, we study decentralized stochastic optimization to minimize a sum of smooth and strongly convex cost functions that are distributed over a directed network of nodes. In contrast to existing work, we use gradient tracking to improve certain aspects of the resulting algorithm. In particular, we propose the S-ADDOPT algorithm, which assumes a stochastic first-order oracle at each node, and show that for a constant step-size α, each node converges linearly to an error ball around the optimal solution, the size of which is controlled by α. For decaying step-sizes O(1/k), we show that S-ADDOPT converges sublinearly to the exact solution at the rate O(1/k) and that its convergence is asymptotically network-independent. The asymptotic behavior of S-ADDOPT is thus comparable to that of centralized stochastic gradient descent. Numerical experiments on both strongly convex and non-convex problems illustrate the convergence behavior and the performance of the proposed algorithm.
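To make the recursion concrete, below is a minimal NumPy sketch of an S-ADDOPT-style update: column-stochastic (push-sum) mixing combined with gradient tracking, driven by a stochastic first-order oracle. The least-squares test problem, the directed ring topology, the weights, and the step-size schedule are illustrative assumptions of this sketch, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 8, 5, 20                       # nodes, dimension, samples per node

# Local cost at node i: f_i(x) = (1/2m) * ||A_i x - b_i||^2 (smooth, strongly convex w.h.p.)
A = [rng.standard_normal((m, d)) for _ in range(n)]
b = [rng.standard_normal(m) for _ in range(n)]

def stoch_grad(i, x):
    """Stochastic first-order oracle at node i: unbiased single-sample gradient."""
    j = rng.integers(m)
    return A[i][j] * (A[i][j] @ x - b[i][j])

# Column-stochastic weights for a directed ring with self-loops
# (each node keeps half its mass and pushes half to its out-neighbor).
B = np.zeros((n, n))
for j in range(n):
    B[j, j] = 0.5
    B[(j + 1) % n, j] = 0.5

X = rng.standard_normal((n, d))                          # primal iterates
Y = np.ones(n)                                           # push-sum scalars
Z = X / Y[:, None]                                       # de-biased estimates
W = np.array([stoch_grad(i, Z[i]) for i in range(n)])    # gradient trackers
G_old = W.copy()

for k in range(5000):
    alpha = 2.0 / (k + 10)               # decaying O(1/k) step-size
    X = B @ X - alpha * W                # mix, then descend along the tracker
    Y = B @ Y                            # push-sum correction for directedness
    Z = X / Y[:, None]
    G_new = np.array([stoch_grad(i, Z[i]) for i in range(n)])
    W = B @ W + G_new - G_old            # track the average stochastic gradient
    G_old = G_new

x_star = np.linalg.lstsq(np.vstack(A), np.concatenate(b), rcond=None)[0]
print("mean distance to optimum:", np.linalg.norm(Z - x_star, axis=1).mean())
```

Replacing the decaying schedule with a constant α reproduces the other regime described above: the nodes converge linearly until they enter an error ball around the optimum whose size grows with α.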

Related research

- Push-SAGA: A decentralized stochastic algorithm with variance reduction over directed graphs (08/13/2020). In this paper, we propose Push-SAGA, a decentralized stochastic first-or...
- Distributed stochastic optimization with gradient tracking over strongly-connected networks (03/18/2019). In this paper, we study distributed stochastic optimization to minimize ...
- A hybrid variance-reduced method for decentralized stochastic non-convex optimization (02/12/2021). This paper considers decentralized stochastic optimization over a networ...
- Swarming for Faster Convergence in Stochastic Optimization (06/11/2018). We study a distributed framework for stochastic optimization which is in...
- Linearly Convergent Algorithm with Variance Reduction for Distributed Stochastic Optimization (02/09/2020). This paper considers a distributed stochastic strongly convex optimizati...
- Linear Convergence of Adaptive Stochastic Gradient Descent (08/28/2019). We prove that the norm version of the adaptive stochastic gradient metho...
- Removing Data Heterogeneity Influence Enhances Network Topology Dependence of Decentralized SGD (05/17/2021). We consider decentralized stochastic optimization problems where a netwo...