z-SignFedAvg: A Unified Stochastic Sign-based Compression for Federated Learning

02/06/2023
by Zhiwei Tang, et al.

Federated Learning (FL) is a promising privacy-preserving distributed learning paradigm, but it suffers from high communication cost when training large-scale machine learning models. Sign-based methods, such as SignSGD <cit.>, have been proposed as a biased gradient compression technique for reducing this communication cost. However, sign-based algorithms can diverge under heterogeneous data, which has motivated advanced techniques, such as the error-feedback method and stochastic sign-based compression, to fix this issue. Nevertheless, these methods still suffer from slower convergence rates, and none of them allows multiple local SGD updates as in FedAvg <cit.>. In this paper, we propose a novel noisy perturbation scheme with a general symmetric noise distribution for sign-based compression, which not only allows one to flexibly control the tradeoff between gradient bias and convergence performance, but also provides a unified viewpoint on existing stochastic sign-based methods. More importantly, the unified noisy perturbation scheme enables the development of the first sign-based FedAvg algorithm (z-SignFedAvg) to accelerate convergence. Theoretically, we show that z-SignFedAvg achieves a faster convergence rate than existing sign-based methods and, under uniformly distributed noise, enjoys the same convergence rate as its uncompressed counterpart. Extensive experiments demonstrate that z-SignFedAvg achieves competitive empirical performance on real datasets and outperforms existing schemes.
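To make the abstract's central idea concrete, here is a minimal sketch of noisy-perturbation sign compression: each gradient coordinate is perturbed with i.i.d. zero-mean symmetric noise before taking the sign, so each coordinate costs one bit to transmit. The function names, the choice of uniform noise, and the `noise_scale` parameter are illustrative assumptions, not the paper's exact algorithm; uniform noise is used here because the abstract highlights it as the case matching the uncompressed convergence rate.

```python
import numpy as np

def z_sign_compress(grad, noise_scale, rng):
    """Compress a gradient to 1 bit per coordinate via a noisy sign.

    Sketch of stochastic sign compression: perturb each coordinate
    with i.i.d. zero-mean symmetric (here uniform) noise, then take
    the sign.
    """
    z = rng.uniform(-noise_scale, noise_scale, size=grad.shape)
    return np.sign(grad + z).astype(np.int8)

def server_decode(signs, noise_scale):
    """Rescale averaged client signs back into a gradient estimate.

    With uniform noise on [-b, b] and |g_i| <= b, one has
    E[sign(g_i + z)] = g_i / b, so multiplying the averaged signs
    by b yields an unbiased (if noisy) estimate of the gradient.
    """
    return noise_scale * np.mean(signs, axis=0)

# Toy round: 4 clients compress the same gradient, the server averages.
rng = np.random.default_rng(0)
g = np.array([0.3, -0.7, 0.05])
b = 1.0  # must upper-bound |g_i| for the unbiasedness argument above
msgs = np.stack([z_sign_compress(g, b, rng) for _ in range(4)])
print(server_decode(msgs, b))  # noisy but unbiased estimate of g
```

Widening the noise distribution reduces the bias of the compressed gradient at the cost of higher variance; this is the bias/convergence tradeoff the perturbation scheme is said to control.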

Related research:

02/25/2020 | Stochastic-Sign SGD for Federated Learning with Theoretical Guarantees
Federated learning (FL) has emerged as a prominent distributed learning ...

02/27/2023 | Communication-efficient Federated Learning with Single-Step Synthetic Features Compressor for Faster Convergence
Reducing communication overhead in federated learning (FL) is challengin...

01/28/2019 | Error Feedback Fixes SignSGD and other Gradient Compression Schemes
Sign-based algorithms (e.g. signSGD) have been proposed as a biased grad...

02/19/2023 | Magnitude Matters: Fixing SIGNSGD Through Magnitude-Aware Sparsification in the Presence of Data Heterogeneity
Communication overhead has become one of the major bottlenecks in the di...

12/10/2021 | Federated Two-stage Learning with Sign-based Voting
Federated learning is a distributed machine learning mechanism where loc...

08/02/2023 | Compressed and distributed least-squares regression: convergence rates with applications to Federated Learning
In this paper, we investigate the impact of compression on stochastic gr...

04/14/2022 | Sign Bit is Enough: A Learning Synchronization Framework for Multi-hop All-reduce with Ultimate Compression
Traditional one-bit compressed stochastic gradient descent can not be di...
