Towards More Efficient Stochastic Decentralized Learning: Faster Convergence and Sparse Communication

05/25/2018
by Zebang Shen, et al.

The decentralized optimization problem has recently been attracting growing attention. Most existing methods are deterministic, have a high per-iteration cost, and converge at a rate that depends quadratically on the problem condition number. Moreover, dense communication is necessary to ensure convergence even when the dataset is sparse. In this paper, we generalize the decentralized optimization problem to a monotone operator root-finding problem and propose a stochastic algorithm named DSBA that (i) converges geometrically, at a rate depending only linearly on the problem condition number, and (ii) can be implemented using sparse communication alone. Additionally, DSBA handles learning problems such as AUC maximization, which cannot be tackled efficiently in the decentralized setting by existing methods. Experiments on convex minimization and AUC maximization validate the efficiency of our method.
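
To make the generalization concrete, the sketch below uses standard consensus-optimization notation rather than the paper's own construction: it states the decentralized problem over n nodes and the generic root-finding form it can be recast into. The operator T shown here is an illustrative assumption, not necessarily the exact operator used by DSBA.

\min_{x \in \mathbb{R}^d} \; f(x) := \frac{1}{n} \sum_{i=1}^{n} f_i(x), \qquad f_i \text{ held locally by node } i,

\text{find } x^\star \text{ such that } 0 \in T(x^\star), \qquad T(x) = \frac{1}{n} \sum_{i=1}^{n} \partial f_i(x).

For convex f_i the two formulations share the same solutions, and T is maximally monotone. Saddle-point reformulations, for example of pairwise losses such as AUC maximization, fit the same template, with T assembled from the partial (sub)gradients of the convex-concave objective.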

