Time-Correlated Sparsification for Communication-Efficient Federated Learning

01/21/2021
by Emre Ozfatura, et al.

Federated learning (FL) enables multiple clients to collaboratively train a shared model without disclosing their local datasets. This is achieved by exchanging local model updates with the help of a parameter server (PS). However, due to the increasing size of trained models, the iterative exchanges between the clients and the PS often become a performance bottleneck. Sparse communication is often employed to reduce this load, whereby only a small subset of the model updates is communicated from the clients to the PS. In this paper, we introduce a novel time-correlated sparsification (TCS) scheme, which builds on the observation that sparse communication amounts to identifying the most significant elements of the underlying model. TCS therefore seeks correlation between the sparse representations used at consecutive iterations of FL, so that the overhead of encoding and transmitting the sparse representation can be significantly reduced without compromising test accuracy. Through extensive simulations on the CIFAR-10 dataset, we show that TCS can match centralized training accuracy with 100-fold sparsification, and achieve up to a 2000-fold reduction in the communication load when employed together with quantization.
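
To make the idea concrete, below is a minimal sketch of how such a time-correlated sparsifier could look for a single client update. It is an illustration under assumptions, not the paper's exact algorithm: the function name tcs_sparsify, the NumPy representation of the update, and the split of the sparsity budget k into (k - k_new) positions reused from the previous round's support plus k_new freshly explored positions are all hypothetical.

    import numpy as np

    def tcs_sparsify(update, prev_mask, k, k_new):
        # Hypothetical sketch: keep the (k - k_new) largest-magnitude
        # entries inside last round's support, plus k_new fresh entries
        # from outside it, so most of the support repeats across rounds.
        flat = update.ravel()
        prev = prev_mask.ravel()
        mask = np.zeros(flat.size, dtype=bool)

        inside = np.flatnonzero(prev)
        n_keep = min(k - k_new, inside.size)   # round 1: inside is empty
        if n_keep > 0:
            top_inside = inside[np.argsort(np.abs(flat[inside]))[-n_keep:]]
            mask[top_inside] = True

        outside = np.flatnonzero(~prev)
        n_fresh = k - n_keep                   # spend the rest of the budget
        if n_fresh > 0:
            top_outside = outside[np.argsort(np.abs(flat[outside]))[-n_fresh:]]
            mask[top_outside] = True

        sparse = np.where(mask, flat, 0.0).reshape(update.shape)
        return sparse, mask.reshape(update.shape)

    # Usage: 100x sparsification (k = 100 out of 10,000 parameters).
    rng = np.random.default_rng(0)
    mask = np.zeros(10_000, dtype=bool)        # empty support before round 1
    for _ in range(3):
        g = rng.normal(size=10_000)            # stand-in for a local update
        g_sparse, mask = tcs_sparsify(g, mask, k=100, k_new=10)

Because only the k_new fresh positions change between rounds, the support can be encoded relative to the previous round's mask, which is where the savings beyond plain top-k sparsification would come from in this sketch.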

