Lower Bounds and Accelerated Algorithms in Distributed Stochastic Optimization with Communication Compression

05/12/2023
by Yutong He, et al.

Communication compression is an essential strategy for alleviating communication overhead by reducing the volume of information exchanged between computing nodes in large-scale distributed stochastic optimization. Although numerous algorithms with convergence guarantees have been proposed, the optimal performance limit under communication compression remains unclear. In this paper, we investigate the performance limit of distributed stochastic optimization algorithms that employ communication compression. We focus on two main classes of compressors, unbiased and contractive, and characterize the best-possible convergence rates attainable with each. We establish lower bounds on the convergence rates of distributed stochastic optimization in six settings, combining strongly-convex, generally-convex, or non-convex objective functions with unbiased or contractive compressors. To bridge the gap between these lower bounds and the rates of existing algorithms, we propose NEOLITHIC, a nearly optimal algorithm with compression that matches the established lower bounds up to logarithmic factors under mild conditions. Extensive experimental results support our theoretical findings. This work provides insight into the theoretical limitations of existing compressors and motivates further research into fundamentally new compressor properties.
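As a minimal illustration of the two compressor classes the abstract refers to (and not of the paper's NEOLITHIC algorithm itself), the Python sketch below implements rand-k sparsification, a standard example of an unbiased compressor, and top-k sparsification, a standard example of a contractive compressor. The choice of operators, the dimension, and the parameter k are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def rand_k_unbiased(x, k, rng):
    """Random-k sparsification with rescaling: an unbiased compressor,
    i.e., E[C(x)] = x, with relative variance bounded by (d/k - 1)."""
    d = x.size
    idx = rng.choice(d, size=k, replace=False)
    out = np.zeros_like(x)
    out[idx] = x[idx] * (d / k)  # rescale so the expectation equals x
    return out

def top_k_contractive(x, k):
    """Top-k sparsification: a (biased) contractive compressor satisfying
    ||C(x) - x||^2 <= (1 - k/d) * ||x||^2."""
    out = np.zeros_like(x)
    idx = np.argpartition(np.abs(x), -k)[-k:]  # indices of k largest magnitudes
    out[idx] = x[idx]
    return out

# Quick empirical check of the defining properties on a random vector.
rng = np.random.default_rng(0)
x = rng.standard_normal(1000)
k = 100

avg = np.mean([rand_k_unbiased(x, k, rng) for _ in range(5000)], axis=0)
print("relative unbiasedness error:", np.linalg.norm(avg - x) / np.linalg.norm(x))

c = top_k_contractive(x, k)
print("contraction ratio:", np.linalg.norm(c - x) ** 2 / np.linalg.norm(x) ** 2,
      "<= 1 - k/d =", 1 - k / x.size)
```

Running the sketch shows the averaged rand-k outputs approaching x (unbiasedness) and the top-k output staying within the (1 - k/d) contraction bound, the two properties that distinguish the compressor classes studied in the paper.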

