FjORD: Fair and Accurate Federated Learning under heterogeneous targets with Ordered Dropout

02/26/2021
by   Samuel Horvath, et al.

Federated Learning (FL) has been gaining significant traction across different ML tasks, ranging from vision to keyboard prediction. In large-scale deployments, client heterogeneity is a fact, and constitutes a primary problem for fairness, training performance and accuracy. Although significant efforts have been made toward tackling statistical data heterogeneity, the diversity in the processing capabilities and network bandwidth of clients, termed system heterogeneity, has remained largely unexplored. Current solutions either disregard a large portion of available devices or set a uniform limit on the model's capacity, restricted by the least capable participants. In this work, we introduce Ordered Dropout, a mechanism that achieves an ordered, nested representation of knowledge in Neural Networks and enables the extraction of lower-footprint submodels without the need for retraining. We further show that, for linear maps, our Ordered Dropout is equivalent to SVD. We employ this technique, along with a self-distillation methodology, in the realm of FL in a framework called FjORD. FjORD alleviates the problem of client system heterogeneity by tailoring the model width to each client's capabilities. Extensive evaluation on both CNNs and RNNs across diverse modalities shows that FjORD consistently leads to significant performance gains over state-of-the-art baselines, while maintaining its nested structure.
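To make the core idea concrete, here is a minimal sketch of ordered (nested) dropout on a single linear layer. It is not the paper's implementation; the function name `ordered_dropout` and the toy layer are illustrative assumptions. The key property, in contrast to standard random dropout, is that the kept units are always a contiguous prefix, so every narrower submodel is nested inside every wider one:

```python
import numpy as np

def ordered_dropout(weights, p):
    """Keep only the first ceil(p * width) output units of a layer,
    zeroing the rest. Because the kept units are always a prefix,
    submodels at smaller p are nested inside those at larger p
    (an illustrative sketch of Ordered Dropout, not the paper's code)."""
    width = weights.shape[0]
    keep = int(np.ceil(p * width))
    mask = np.zeros(width)
    mask[:keep] = 1.0
    return weights * mask[:, None]

# A toy linear layer with 4 output units and 3 inputs.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 3))

# A client that can only run half the model uses p = 0.5:
W_half = ordered_dropout(W, 0.5)   # last two rows are zeroed out
W_full = ordered_dropout(W, 1.0)   # identical to W

# Nesting property: the half-width submodel is a prefix of the full one.
assert np.allclose(W_half[:2], W_full[:2])
assert np.allclose(W_half[2:], 0.0)
```

In a federated round, each client would be assigned a `p` matching its compute and bandwidth budget, train only the corresponding prefix of the weights, and the server would aggregate overlapping prefixes across clients.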


Related research

10/29/2022
Auxo: Heterogeneity-Mitigating Federated Learning via Scalable Client Clustering
Federated learning (FL) is an emerging machine learning (ML) paradigm th...

07/11/2023
Benchmarking Algorithms for Federated Domain Generalization
While prior domain generalization (DG) benchmarks consider train-test da...

11/07/2022
Closing the Gap between Client and Global Model Performance in Heterogeneous Federated Learning
The heterogeneity of hardware and data is a well-known and studied probl...

05/31/2021
Unifying Distillation with Personalization in Federated Learning
Federated learning (FL) is a decentralized privacy-preserving learning t...

08/10/2022
FedOBD: Opportunistic Block Dropout for Efficiently Training Large-scale Neural Networks through Federated Learning
Large-scale neural networks possess considerable expressive power. They ...

02/05/2014
Learning Ordered Representations with Nested Dropout
In this paper, we study ordered representations of data in which differe...
