Linearly Convergent Algorithm with Variance Reduction for Distributed Stochastic Optimization

by Jinlong Lei, et al.

This paper considers a distributed stochastic strongly convex optimization problem, in which agents connected over a network aim to cooperatively minimize the average of all agents' local cost functions. Due to the stochasticity of gradient estimation and the distributed nature of the local objectives, fast linearly convergent distributed algorithms have not been achieved previously. This work proposes a novel distributed stochastic gradient tracking algorithm with variance reduction, where each local gradient is estimated by averaging an increasing batch of sampled gradients. With an undirected connected communication graph and a geometrically increasing batch size, the iterates are shown to converge in mean to the optimal solution at a geometric rate, i.e., with linear convergence. The iteration, communication, and oracle complexities for obtaining an ε-optimal solution are established as well. In particular, the communication complexity is O(ln(1/ε)), while the oracle complexity (the number of sampled gradients) is O(1/ε²), which matches that of centralized approaches. Hence, the proposed scheme is communication-efficient without requiring extra sampled gradients. Numerical simulations are given to demonstrate the theoretical results.
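The mechanism described in the abstract can be illustrated with a small simulation. The sketch below (a minimal, assumed instantiation, not the paper's exact algorithm or parameters) combines the two ingredients the abstract names: a gradient-tracking update over a doubly stochastic mixing matrix, and variance reduction via a geometrically increasing batch of sampled gradients. The local costs, graph, step size `alpha`, and growth factor `growth` are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem: 5 agents on a ring, each with local cost
# f_i(x) = 0.5 * (x - b_i)^2.  The network-wide minimizer of the
# average cost (1/n) * sum_i f_i is the mean of the b_i.
n = 5
b = rng.normal(size=n)
x_star = b.mean()

# Doubly stochastic mixing matrix for an undirected ring graph.
W = np.zeros((n, n))
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

def sampled_grad(i, x, batch):
    # Unbiased stochastic gradient of f_i at x, averaged over `batch`
    # samples; the noise variance shrinks as 1/batch, which is the
    # variance-reduction effect of the increasing batch size.
    noise = rng.normal(scale=1.0, size=batch).mean()
    return (x - b[i]) + noise

alpha = 0.1      # step size (assumed small enough for stability)
growth = 1.3     # geometric batch-size growth factor (assumed)
x = np.zeros(n)  # local iterates
g = np.array([sampled_grad(i, x[i], 1) for i in range(n)])
y = g.copy()     # gradient trackers, initialized to local gradients

for k in range(60):
    x = W @ x - alpha * y                    # consensus + gradient step
    batch = int(np.ceil(growth ** (k + 1)))  # geometrically growing batch
    g_new = np.array([sampled_grad(i, x[i], batch) for i in range(n)])
    y = W @ y + g_new - g                    # track the average gradient
    g = g_new

print(np.max(np.abs(x - x_star)))  # all agents end up near the optimum
```

With the geometrically growing batch, the sampling noise injected at iteration k decays like `growth**(-k/2)`, so the overall error contracts geometrically, mirroring the linear-convergence claim; a constant batch size would instead stall at a noise floor.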






