An Efficient Virtual Data Generation Method for Reducing Communication in Federated Learning

06/21/2023
by Cheng Yang, et al.

Communication overhead is one of the major challenges in Federated Learning (FL). A few classical schemes assume that the server can extract auxiliary information about the participants' training data from their local models and use it to construct a central dummy dataset. The server then uses this dummy dataset to fine-tune the aggregated global model, reaching the target test accuracy in fewer communication rounds. In this paper, we unify these solutions into a data-based communication-efficient FL framework. The key to the proposed framework is designing an efficient extraction module (EM) that ensures the dummy dataset has a positive effect on fine-tuning the aggregated global model. Unlike existing methods, which build the EM with a generator, our proposed method, FedINIBoost, borrows the idea of gradient matching to construct the EM. Specifically, at each communication round, FedINIBoost builds a proxy dataset of the real dataset in two steps for each participant. The server then aggregates all the proxy datasets into a central dummy dataset, which it uses to fine-tune the aggregated global model. Extensive experiments verify the superiority of our method over the classical methods FedAVG, FedProx, MOON, and FedFTG. Moreover, FedINIBoost significantly improves the performance of the aggregated global model at the initial stage of FL.
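The gradient-matching idea underlying the EM can be illustrated as follows: synthetic (proxy) examples are optimized so that the gradient they induce on the current model approximates the gradient of the participant's real data, letting the server fine-tune on the proxies without ever seeing the real data. The sketch below is a minimal illustration on a linear model with squared loss; the model, data shapes, learning rate, and iteration count are illustrative assumptions, not the paper's actual two-step EM.

```python
import numpy as np

rng = np.random.default_rng(0)

# A participant's "real" local data (never shared with the server).
d, n = 5, 32
X_real = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y_real = X_real @ w_true + 0.1 * rng.normal(size=n)

# Current model parameters (linear model, mean squared error).
w = rng.normal(size=d)

def grad(X, y, w):
    """Gradient of 0.5 * mean((Xw - y)^2) with respect to w."""
    r = X @ w - y
    return X.T @ r / len(y)

g_real = grad(X_real, y_real, w)

# Proxy dataset: a few synthetic points, optimized so that their
# gradient on the current model matches the real-data gradient.
m = 4
X_d = rng.normal(size=(m, d))
y_d = rng.normal(size=m)

gap0 = np.linalg.norm(grad(X_d, y_d, w) - g_real)

lr = 0.01
for _ in range(5000):
    r = X_d @ w - y_d
    g_d = X_d.T @ r / m
    D = 2.0 * (g_d - g_real)  # dL/dg_d for L = ||g_d - g_real||^2
    # Analytic gradients of the matching loss w.r.t. the proxy data.
    gX = (r[:, None] * D[None, :] + (X_d @ D)[:, None] * w[None, :]) / m
    gy = -(X_d @ D) / m
    X_d -= lr * gX
    y_d -= lr * gy

final_gap = np.linalg.norm(grad(X_d, y_d, w) - g_real)
print(f"gradient gap: before {gap0:.3f}, after {final_gap:.2e}")
```

In an FL round, each participant would run such an optimization against its own model update, send only the proxy points to the server, and the server would pool them into the central dummy dataset used for fine-tuning.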


