FedOBD: Opportunistic Block Dropout for Efficiently Training Large-scale Neural Networks through Federated Learning

08/10/2022
by   Yuanyuan Chen, et al.

Large-scale neural networks possess considerable expressive power, making them well-suited for complex learning tasks in industrial applications. However, large-scale models pose significant challenges for training under the current Federated Learning (FL) paradigm. Existing approaches for efficient FL training often leverage model parameter dropout. However, manipulating individual model parameters is not only inefficient at meaningfully reducing the communication overhead when training large-scale FL models, but may also be detrimental to scaling efforts and model performance, as shown by recent research. To address these issues, we propose the Federated Opportunistic Block Dropout (FedOBD) approach. The key novelty is that it decomposes large-scale models into semantic blocks so that FL participants can opportunistically upload quantized blocks, which are deemed to be significant towards training the model, to the FL server for aggregation. Extensive experiments evaluating FedOBD against five state-of-the-art approaches on multiple real-world datasets show that it reduces the overall communication overhead by more than 70% while achieving the highest test accuracy. To the best of our knowledge, FedOBD is the first approach to perform dropout on FL models at the block level rather than at the individual parameter level.
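To make the block-level idea concrete, here is a minimal sketch of the client-side logic the abstract describes: rank semantic blocks by how much they changed during local training, keep only the most significant ones, and quantize them before upload. The importance measure (relative L2 change), the `dropout_rate` knob, and the uniform 8-bit quantizer are illustrative assumptions, not FedOBD's exact formulation.

```python
import numpy as np

def select_significant_blocks(prev_blocks, curr_blocks, dropout_rate=0.3):
    """Rank semantic blocks by how much they changed during local training
    and keep only the most significant ones for upload.

    prev_blocks / curr_blocks: dicts mapping block name -> parameter array.
    dropout_rate: fraction of blocks to drop (hypothetical knob; the
    paper's actual importance measure and scheduling may differ).
    """
    # Importance proxy: relative L2 change of each block's parameters.
    scores = {
        name: np.linalg.norm(curr_blocks[name] - prev_blocks[name])
        / (np.linalg.norm(prev_blocks[name]) + 1e-12)
        for name in curr_blocks
    }
    ranked = sorted(scores, key=scores.get, reverse=True)
    keep = max(1, int(round(len(ranked) * (1 - dropout_rate))))
    return ranked[:keep]

def quantize_block(block, num_bits=8):
    """Uniform quantization of a block to shrink the upload (a common
    scheme; FedOBD's exact quantizer may differ)."""
    lo, hi = float(block.min()), float(block.max())
    scale = (hi - lo) / (2 ** num_bits - 1) or 1.0  # guard constant blocks
    quantized = np.round((block - lo) / scale).astype(np.uint8)
    return quantized, lo, scale  # server dequantizes as q * scale + lo
```

A client would run `select_significant_blocks` after local training, then transmit only the `quantize_block` output for the selected blocks, which is where the communication savings come from.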


Related research:

02/22/2023 · Efficient Training of Large-scale Industrial Fault Diagnostic Models through Federated Opportunistic Block Dropout
Artificial intelligence (AI)-empowered industrial fault diagnostics is i...

09/30/2021 · Federated Dropout – A Simple Approach for Enabling Federated Learning on Resource Constrained Devices
Federated learning (FL) is a popular framework for training an AI model ...

01/26/2022 · Fast Server Learning Rate Tuning for Coded Federated Dropout
In cross-device Federated Learning (FL), clients with low computational ...

08/07/2023 · The Prospect of Enhancing Large-Scale Heterogeneous Federated Learning with Transformers
Federated learning (FL) addresses data privacy concerns by enabling coll...

01/05/2022 · Towards Understanding Quality Challenges of the Federated Learning: A First Look from the Lens of Robustness
Federated learning (FL) is a widely adopted distributed learning paradig...

09/29/2021 · LightSecAgg: Rethinking Secure Aggregation in Federated Learning
Secure model aggregation is a key component of federated learning (FL) t...

02/26/2021 · FjORD: Fair and Accurate Federated Learning under heterogeneous targets with Ordered Dropout
Federated Learning (FL) has been gaining significant traction across dif...
