Slashing Communication Traffic in Federated Learning by Transmitting Clustered Model Updates

05/10/2021
by Laizhong Cui, et al.

Federated Learning (FL) is an emerging decentralized learning framework through which multiple clients collaboratively train a shared model. However, a major obstacle that impedes the wide deployment of FL is the massive communication traffic it generates. Training high-dimensional machine learning models (such as CNNs) requires exchanging model updates between clients and the parameter server (PS) over the Internet, which can quickly exhaust network resources. Compressing model updates is an effective way to reduce this traffic. However, existing works lack a flexible unbiased compression algorithm applicable to both uplink and downlink compression in FL. In this work, we devise the Model Update Compression by Soft Clustering (MUCSC) algorithm to compress the model updates transmitted between clients and the PS. With MUCSC, only the cluster centroids and the cluster ID of each model update need to be transmitted. Moreover, we prove that: 1) the compressed model updates are unbiased estimates of their original values, so the convergence rate is unchanged when transmitting compressed model updates; and 2) MUCSC minimizes the influence of the compression error on the model accuracy. We then propose the boosted MUCSC (B-MUCSC) algorithm, a biased compression algorithm that achieves an extremely high compression rate by grouping insignificant model updates into a super cluster, making it suitable for scenarios with very scarce network resources. Finally, we conduct extensive experiments on the CIFAR-10 and FEMNIST datasets to demonstrate that our algorithms not only substantially reduce the volume of communication traffic in FL, but also improve training efficiency in practical networks.
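To make the centroids-plus-IDs idea concrete, the sketch below compresses a flat model-update vector by clustering its scalar entries and transmitting only the cluster centroids and a small integer ID per entry. It is a minimal illustration only: the plain 1-D k-means routine and the names `kmeans_1d`, `compress`, and `decompress` are assumptions for this example, not the authors' MUCSC soft-clustering algorithm or its unbiasedness guarantees.

```python
# Illustrative clustering-based compression of a model update:
# send k float centroids plus one small-integer cluster ID per entry,
# instead of one full-precision float per entry.
import numpy as np


def kmeans_1d(values, k, iters=20, seed=0):
    """Plain Lloyd's k-means on scalar values; returns (centroids, labels)."""
    rng = np.random.default_rng(seed)
    centroids = rng.choice(values, size=k, replace=False)
    for _ in range(iters):
        # Assign each value to its nearest centroid.
        labels = np.abs(values[:, None] - centroids[None, :]).argmin(axis=1)
        # Recompute each centroid as the mean of its assigned values.
        for c in range(k):
            members = values[labels == c]
            if members.size > 0:
                centroids[c] = members.mean()
    return centroids, labels


def compress(update, k=16):
    """Compress a flat update vector into (centroids, per-entry cluster IDs)."""
    centroids, labels = kmeans_1d(update.astype(np.float64), k)
    # IDs fit in a small integer type; this is where the traffic saving comes from.
    return centroids.astype(np.float32), labels.astype(np.uint8)


def decompress(centroids, labels):
    """Reconstruct the update by replacing each ID with its centroid."""
    return centroids[labels]


if __name__ == "__main__":
    update = np.random.randn(10_000).astype(np.float32)  # stand-in for a model update
    centroids, ids = compress(update, k=16)
    recovered = decompress(centroids, ids)
    ratio = update.nbytes / (centroids.nbytes + ids.nbytes)
    print(f"compression ratio ~{ratio:.1f}x, "
          f"mean abs error {np.abs(update - recovered).mean():.4f}")
```

In the same spirit, a B-MUCSC-style variant would map all entries deemed insignificant (e.g. near zero) to a single "super cluster" ID before clustering the rest, trading some bias for a much higher compression rate.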

