CosSGD: Nonlinear Quantization for Communication-efficient Federated Learning

12/15/2020
by Yang He, et al.

Federated learning enables training across clients without transferring their local data to a central server. Despite its success, federated learning still needs better ways to communicate the information most critical to updating a model under limited bandwidth; doing so would extend the scheme to a far wider range of application scenarios. In this work, we propose a nonlinear quantization for compressed stochastic gradient descent that can be easily adopted in federated learning. Based on the proposed quantization, our system reduces communication cost by up to three orders of magnitude while largely preserving the convergence and accuracy of training. Extensive experiments on image classification and brain tumor semantic segmentation with the MNIST, CIFAR-10, and BraTS datasets show state-of-the-art effectiveness and impressive communication efficiency.
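The core idea is that gradient entries can be sent as a handful of bits each instead of full 32-bit floats, using a nonlinear code that spends its resolution where gradient values actually concentrate. The paper's exact CosSGD quantizer is not reproduced below; this is a minimal sketch assuming a square-root companding curve as the nonlinearity, and the 4-bit code width, per-tensor max scaling, and all function names are illustrative choices rather than the authors' design.

```python
import numpy as np

BITS = 4                     # bits per magnitude code; 1 extra bit carries the sign
LEVELS = 2 ** BITS - 1

def quantize(grad, eps=1e-12):
    """Compress a gradient tensor into a per-tensor scale, sign bits, and
    small integer codes; only these compact values would travel over the
    network from a client to the server."""
    scale = float(np.max(np.abs(grad))) + eps   # per-tensor normalization factor
    x = np.abs(grad) / scale                    # normalized magnitudes in [0, 1]
    # NOTE: square-root companding is an assumed stand-in nonlinearity,
    # not the paper's exact CosSGD curve. Being concave, it allocates
    # more of the code range to small magnitudes.
    codes = np.rint(np.sqrt(x) * LEVELS).astype(np.uint8)
    signs = np.signbit(grad)
    return scale, signs, codes

def dequantize(scale, signs, codes):
    """Invert the companding curve to recover approximate gradients."""
    x = (codes.astype(np.float32) / LEVELS) ** 2   # invert the sqrt mapping
    g = x * scale
    return np.where(signs, -g, g).astype(np.float32)

# Toy round trip: gradients are roughly zero-centered and heavy on small values.
rng = np.random.default_rng(0)
grad = rng.standard_normal(10_000).astype(np.float32) * 0.01
scale, signs, codes = quantize(grad)
recovered = dequantize(scale, signs, codes)

rel_err = np.linalg.norm(grad - recovered) / np.linalg.norm(grad)
ratio = 32.0 / (BITS + 1)                       # float32 vs. (sign + code) bits
print(f"relative error: {rel_err:.4f}, compression ratio: {ratio:.1f}x")
```

Because gradient magnitudes cluster near zero, the concave mapping assigns more quantization levels to small values than a uniform (linear) quantizer would, which is what lets an aggressive bit budget keep the relative reconstruction error low.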

Related research

05/05/2022
Communication-Efficient Adaptive Federated Learning
Federated learning is a machine learning training paradigm that enables ...

12/01/2020
Communication-Efficient Federated Distillation
Communication constraints are one of the major challenges preventing the...

11/29/2021
SPATL: Salient Parameter Aggregation and Transfer Learning for Heterogeneous Clients in Federated Learning
Efficient federated learning is one of the key challenges for training a...

05/13/2022
OFedQIT: Communication-Efficient Online Federated Learning via Quantization and Intermittent Transmission
Online federated learning (OFL) is a promising framework to collaborativ...

10/02/2022
SAGDA: Achieving 𝒪(ε^-2) Communication Complexity in Federated Min-Max Learning
To lower the communication complexity of federated min-max learning, a n...

03/15/2023
Communication-Efficient Design for Quantized Decentralized Federated Learning
Decentralized federated learning (DFL) is a variant of federated learnin...

09/12/2022
Communication-Efficient and Privacy-Preserving Feature-based Federated Transfer Learning
Federated learning has attracted growing interest as it preserves the cl...
