FedDD: Toward Communication-efficient Federated Learning with Differential Parameter Dropout

08/31/2023
by Zhiying Feng, et al.

Federated Learning (FL) requires frequent exchange of model parameters, which leads to long communication delays, especially when clients' network environments vary greatly. Moreover, the parameter server must wait for the slowest client (i.e., the straggler, which may have the largest model size, the lowest computing capability, or the worst network condition) to upload its parameters, which can significantly degrade communication efficiency. Commonly used remedies such as partial client selection waste computing resources and weaken the generalization of the global model. To tackle this problem along a different line, in this paper we advocate model parameter dropout instead of client selection, and accordingly propose a novel framework, Federated learning scheme with Differential parameter Dropout (FedDD). FedDD consists of two key modules: dropout rate allocation and uploaded parameter selection, which optimize the parameter uploading ratios tailored to clients' heterogeneous conditions and select a proper set of important model parameters for uploading subject to each client's dropout rate constraint. Specifically, dropout rate allocation is formulated as a convex optimization problem that takes system heterogeneity, data heterogeneity, and model heterogeneity among clients into consideration. The uploaded parameter selection strategy prioritizes important parameters for uploading to speed up convergence. Furthermore, we theoretically analyze the convergence of the proposed FedDD scheme. Extensive performance evaluations demonstrate that FedDD achieves outstanding performance in both communication efficiency and model convergence, and also possesses a strong generalization capability to data of rare classes.
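To make the uploaded parameter selection concrete, below is a minimal sketch of how a client might pick which parameters to upload under a given dropout rate. It assumes importance is measured by the magnitude of each parameter's change since the last global model (a common heuristic; the paper's exact importance criterion and the convex dropout-rate allocation are not reproduced here), and the function name `select_upload_mask` is illustrative rather than from the paper.

```python
import numpy as np

def select_upload_mask(local_params: np.ndarray,
                       global_params: np.ndarray,
                       dropout_rate: float) -> np.ndarray:
    """Return a boolean mask marking which parameters a client uploads.

    A fraction `dropout_rate` of the parameters is dropped; the remaining
    (1 - dropout_rate) fraction with the largest |local - global| change
    is kept for uploading (assumed importance heuristic).
    """
    delta = np.abs(local_params - global_params)
    n_keep = max(1, int(round((1.0 - dropout_rate) * delta.size)))
    # Indices of the n_keep largest-magnitude updates.
    keep_idx = np.argpartition(delta, -n_keep)[-n_keep:]
    mask = np.zeros(delta.size, dtype=bool)
    mask[keep_idx] = True
    return mask


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    global_w = rng.normal(size=1000)
    local_w = global_w + rng.normal(scale=0.1, size=1000)

    # Hypothetical per-client dropout rates, e.g. produced by the
    # server-side dropout-rate allocation step.
    for p_k in (0.5, 0.8, 0.9):
        mask = select_upload_mask(local_w, global_w, p_k)
        print(f"dropout_rate={p_k}: uploading {mask.sum()} of {mask.size} parameters")
```

In this sketch, a higher per-client dropout rate directly translates into fewer uploaded parameters, which is how heterogeneous communication budgets would be respected.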


