A Primal-Dual Framework for Decentralized Stochastic Optimization

12/08/2020
by Ketan Rajawat, et al.

We consider the decentralized convex optimization problem, where multiple agents must cooperatively minimize a cumulative objective function, with each local function expressible as an empirical average of data-dependent losses. State-of-the-art approaches for decentralized optimization rely on gradient tracking, where consensus is enforced via a doubly stochastic mixing matrix. Constructing such mixing matrices is not straightforward and requires coordination even before the optimization algorithm starts. This paper puts forth a primal-dual framework for decentralized stochastic optimization that obviates the need for such doubly stochastic matrices. Instead, dual variables are maintained to track the disagreement between neighbors. The proposed framework is flexible and is used to develop decentralized variants of the SAGA, L-SVRG, SVRG++, and SEGA algorithms. Using a unified proof, we establish that the oracle complexity of these decentralized variants is O(1/ϵ), matching the complexity bounds obtained for the centralized variants. We also present a decentralized primal-dual accelerated SVRG algorithm achieving O(1/√(ϵ)) oracle complexity, again matching the bound for the centralized accelerated SVRG. Numerical tests establish the superior performance of the proposed algorithms compared to variance-reduced gradient tracking algorithms.
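
As a point of reference, the sketch below illustrates one standard way a consensus-constrained primal-dual setup of this type can be written. It is a generic saddle-point formulation, not necessarily the exact updates proposed in the paper; the edge operator A, the stepsizes η and β, and the variance-reduced estimate g^k are illustrative placeholders.

% Consensus reformulation over a graph G = (V, E): each agent i keeps a
% local copy x_i, and agreement is imposed only across edges, so no
% doubly stochastic mixing matrix is required.
\begin{align}
  \min_{x_1,\dots,x_n} \ \sum_{i=1}^{n} f_i(x_i),
  \qquad f_i(x) = \frac{1}{m}\sum_{j=1}^{m} f_{ij}(x),
  \qquad \text{s.t. } A x = 0,
\end{align}
% where x stacks the local copies and (Ax)_e = x_i - x_j is the disagreement
% along edge e = (i, j). Attaching a dual variable lambda_e to every edge
% gives the Lagrangian L(x, lambda) = \sum_i f_i(x_i) + lambda^T A x, and a
% generic stochastic primal-dual iteration with stepsizes eta, beta > 0 reads
\begin{align}
  x^{k+1} &= x^{k} - \eta \left( g^{k} + A^{\top} \lambda^{k} \right), \\
  \lambda^{k+1} &= \lambda^{k} + \beta \, A x^{k+1},
\end{align}
% where g^k stacks variance-reduced stochastic estimates of the local
% gradients \nabla f_i(x_i^k) (e.g., SAGA-, L-SVRG-, SVRG++-, or SEGA-style).
% Both steps are local: (A^T lambda)_i only involves multipliers on edges
% incident to agent i, and the dual update accumulates neighbor disagreement,
% which is the role the abstract attributes to the dual variables.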
