Variance Reduced EXTRA and DIGing and Their Optimal Acceleration for Strongly Convex Decentralized Optimization

09/09/2020
by Huan Li, et al.

We study stochastic decentralized optimization for the problem of training machine learning models with large-scale distributed data. We extend the widely used EXTRA and DIGing methods with variance reduction (VR), and propose two methods: VR-EXTRA and VR-DIGing. The proposed VR-EXTRA requires O((κ_s+n)log(1/ϵ)) stochastic gradient evaluations and O((κ_b+κ_c)log(1/ϵ)) communication rounds to reach precision ϵ, where κ_s and κ_b are the stochastic and batch condition numbers for strongly convex and smooth problems, respectively, κ_c is the condition number of the communication network, and n is the sample size on each distributed node. The proposed VR-DIGing has a slightly higher communication cost of O((κ_b+κ_c^2)log(1/ϵ)). Our stochastic gradient computation complexities match those of single-machine VR methods such as SAG, SAGA, and SVRG, and our communication complexities match those of EXTRA and DIGing, respectively. To further speed up convergence, we also propose accelerated VR-EXTRA and VR-DIGing, which achieve both the optimal O((√(nκ_s)+n)log(1/ϵ)) stochastic gradient computation complexity and the optimal O(√(κ_bκ_c)log(1/ϵ)) communication complexity. Our stochastic gradient computation complexity again matches that of single-machine accelerated VR methods such as Katyusha, and our communication complexity matches that of accelerated full-batch decentralized methods such as MSDA.
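As context, below is a minimal single-machine sketch of the SVRG-style variance-reduced gradient estimator that methods of this family build on locally at each node. It is not the paper's VR-EXTRA or VR-DIGing (which add decentralized communication steps); all names here (svrg, A, b, step, n_epochs) are illustrative assumptions, shown on a toy least-squares problem.

```python
import numpy as np

def svrg(A, b, step=0.05, n_epochs=20, seed=0):
    """Minimize (1/2n) * ||Ax - b||^2 with single-machine SVRG (sketch only)."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    x = np.zeros(d)
    for _ in range(n_epochs):
        snapshot = x.copy()
        # Full (batch) gradient at the snapshot point, recomputed once per epoch.
        mu = A.T @ (A @ snapshot - b) / n
        for _ in range(n):
            i = rng.integers(n)  # sample one data point uniformly
            # Per-sample gradients at the current iterate and at the snapshot.
            g_x = A[i] * (A[i] @ x - b[i])
            g_s = A[i] * (A[i] @ snapshot - b[i])
            # Variance-reduced estimator: unbiased for the full gradient at x,
            # with variance that vanishes as x and snapshot approach the optimum.
            x = x - step * (g_x - g_s + mu)
    return x

# Tiny usage example on a random well-conditioned least-squares problem.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 5))
b = A @ np.ones(5) + 0.01 * rng.standard_normal(200)
print(np.round(svrg(A, b), 2))  # close to the all-ones vector
```

The correction term (g_x - g_s + mu) is what distinguishes VR methods from plain SGD: the estimator stays unbiased while its variance shrinks along the trajectory, which is what permits linear O((κ_s+n)log(1/ϵ)) rates instead of SGD's sublinear ones.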
