Stochastic Gradient Methods with Compressed Communication for Decentralized Saddle Point Problems

05/28/2022
by Chhavi Sharma, et al.

We propose two stochastic gradient algorithms to solve a class of saddle-point problems in a decentralized setting (without a central server). The proposed algorithms are the first to achieve sub-linear/linear computation and communication complexities, using stochastic gradient and stochastic variance-reduced gradient oracles respectively, with compressed information exchange, for non-smooth strongly-convex strongly-concave saddle-point problems in a decentralized setting. Our first algorithm is a Restart-based Decentralized Proximal Stochastic Gradient method with Compression (C-RDPSG) for general stochastic settings. We provide rigorous theoretical guarantees for C-RDPSG, with gradient computation and communication complexities of order 𝒪( (1+δ)^4 (1/L^2) κ_f^2 κ_g^2 (1/ϵ) ) to achieve an ϵ-accurate saddle-point solution, where δ denotes the compression factor, κ_f and κ_g denote respectively the condition numbers of the objective function and the communication graph, and L denotes the smoothness parameter of the smooth part of the objective function. Next, we present a Decentralized Proximal Stochastic Variance-Reduced Gradient algorithm with Compression (C-DPSVRG) for the finite-sum setting, which exhibits gradient computation and communication complexities of order 𝒪( (1+δ) κ_f^2 κ_g log(1/ϵ) ). Extensive numerical experiments show competitive performance of the proposed algorithms and support the theoretical results obtained.
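The abstract names three building blocks: a contractive compressor with factor δ, a proximal step for the non-smooth term, and an SVRG-type variance-reduced gradient oracle. The sketch below shows one plausible instance of each, assuming top-k sparsification as the compressor and an ℓ1 proximal map; the function names (top_k_compress, prox_l1, svrg_estimator) are illustrative and are not taken from the paper.

```python
import numpy as np

def top_k_compress(x, k):
    # Top-k sparsification: keep the k largest-magnitude entries of x
    # and zero out the rest. This is a standard contractive compressor,
    # satisfying ||C(x) - x||^2 <= (1 - k/d) ||x||^2, the kind of
    # property the compression factor delta parameterizes.
    d = x.size
    out = np.zeros_like(x)
    idx = np.argpartition(np.abs(x), d - k)[d - k:]
    out[idx] = x[idx]
    return out

def prox_l1(v, lam):
    # Soft-thresholding: the proximal operator of lam * ||.||_1, a
    # typical non-smooth term handled by the "proximal" step in the
    # algorithm names.
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def svrg_estimator(grad_i, x, x_snap, full_grad_snap, i):
    # SVRG-style variance-reduced gradient: an unbiased estimate of the
    # full gradient whose variance vanishes as x approaches the snapshot
    # point x_snap; this is the ingredient behind linear (log(1/eps))
    # rates in the finite-sum setting.
    return grad_i(x, i) - grad_i(x_snap, i) + full_grad_snap

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    g = rng.normal(size=10)
    print(top_k_compress(g, 3))  # sparse vector with 3 nonzeros
    print(prox_l1(g, 0.5))       # entries shrunk toward zero
```

Top-k is only one admissible choice; random sparsification or quantization operators with the same contraction property play the same role in analyses of this kind.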


