Fast Server Learning Rate Tuning for Coded Federated Dropout

01/26/2022
by Giacomo Verardo, et al.

In cross-device Federated Learning (FL), clients with low computational power train a common machine learning model by exchanging parameter updates instead of potentially private data. Federated Dropout (FD) is a technique that improves the communication efficiency of an FL session by selecting a subset of model variables to be updated in each training round. However, FD produces considerably lower accuracy and higher convergence time compared to standard FL. In this paper, we leverage coding theory to enhance FD by allowing a different sub-model to be used at each client. We also show that by carefully tuning the server learning rate hyper-parameter, we can achieve higher training speed and up to the same final accuracy as the no-dropout case. For the EMNIST dataset, our mechanism achieves 99.6% of the final accuracy of the no-dropout case while requiring 2.43x less bandwidth to reach that accuracy level.
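The interplay between per-client sub-models and the server learning rate can be pictured with a short sketch. The Python below is illustrative only and is not the paper's mechanism: the coding-theory-based sub-model assignment is replaced by independent random masks, local training is a stand-in gradient step, and all names (random_mask, client_update, server_round, keep_prob, server_lr) are hypothetical.

# Minimal sketch, not the paper's implementation: Federated Dropout with
# per-client sub-models and a tunable server learning rate. The coded
# (coding-theory-based) mask assignment from the paper is replaced here
# by independent random masks, and local training by a stand-in step.
import numpy as np

rng = np.random.default_rng(0)

def random_mask(shape, keep_prob):
    # Select the subset of model variables this client will update.
    return rng.random(shape) < keep_prob

def client_update(global_weights, mask, lr=0.01):
    # Placeholder for local SGD: returns a masked weight delta.
    fake_grad = np.sign(global_weights) * 0.1
    return (-lr * fake_grad) * mask  # only the sub-model is touched

def server_round(global_weights, num_clients, keep_prob=0.5, server_lr=1.0):
    # One FL round: average the masked client deltas coordinate-wise
    # (dividing by how many clients updated each variable), then apply
    # them scaled by the server learning rate.
    agg = np.zeros_like(global_weights)
    counts = np.zeros_like(global_weights)
    for _ in range(num_clients):
        mask = random_mask(global_weights.shape, keep_prob)  # per-client sub-model
        agg += client_update(global_weights, mask)
        counts += mask
    avg = np.divide(agg, counts, out=np.zeros_like(agg), where=counts > 0)
    return global_weights + server_lr * avg  # server_lr is the tuned hyper-parameter

# Example: five rounds with three clients and an amplified server rate.
w = rng.normal(size=10)
for _ in range(5):
    w = server_round(w, num_clients=3, keep_prob=0.5, server_lr=1.2)

In this sketch, setting server_lr above 1 amplifies the averaged update, which is the kind of server-side tuning the abstract credits with recovering the no-dropout accuracy faster.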


Related research

05/26/2022 · Friends to Help: Saving Federated Learning from Client Dropout
Federated learning (FL) is an outstanding distributed machine learning f...

10/06/2022 · DReS-FL: Dropout-Resilient Secure Federated Learning for Non-IID Clients via Secret Data Sharing
Federated learning (FL) strives to enable collaborative training of mach...

07/05/2023 · FLuID: Mitigating Stragglers in Federated Learning using Invariant Dropout
Federated Learning (FL) allows machine learning models to train locally ...

12/16/2021 · DISTREAL: Distributed Resource-Aware Learning in Heterogeneous Systems
We study the problem of distributed training of neural networks (NNs) on...

08/10/2022 · FedOBD: Opportunistic Block Dropout for Efficiently Training Large-scale Neural Networks through Federated Learning
Large-scale neural networks possess considerable expressive power. They ...

08/31/2023 · FedDD: Toward Communication-efficient Federated Learning with Differential Parameter Dropout
Federated Learning (FL) requires frequent exchange of model parameters, ...

08/30/2022 · Reducing Impacts of System Heterogeneity in Federated Learning using Weight Update Magnitudes
The widespread adoption of handheld devices has fueled rapid growth in ...
