Supernet Training for Federated Image Classification under System Heterogeneity

06/03/2022
by Taehyeon Kim, et al.

Efficiently deploying deep neural networks across many devices with diverse resource constraints, especially edge devices, is one of the most challenging problems when data privacy must also be preserved. Conventional approaches have evolved either to improve a single global model while keeping each client's training data decentralized (i.e., data heterogeneity) or to train a once-for-all network that supports diverse architectural settings for heterogeneous systems with different computational capabilities (i.e., model heterogeneity). However, little research has considered both directions simultaneously. In this work, we propose a novel framework that addresses both scenarios, Federation of Supernet Training (FedSup), in which clients send and receive a supernet containing all possible architectures that can be sampled from it. The framework is inspired by the observation that averaging parameters during the model aggregation stage of Federated Learning (FL) resembles weight sharing in supernet training. Specifically, FedSup combines the weight-sharing approach widely used to train single-shot models with FL's parameter averaging (FedAvg). Under this framework, we further present an efficient algorithm (E-FedSup) that sends only a sub-model to each client in the broadcast stage, reducing communication cost and training overhead. We demonstrate several strategies for enhancing supernet training in the FL environment and conduct extensive empirical evaluations. The resulting framework is shown to be robust to both data and model heterogeneity on several standard benchmarks.
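The core analogy in the abstract, that FedAvg's parameter averaging resembles weight sharing in supernet training, can be illustrated with a minimal sketch. This is not the paper's implementation: the names (`sub_model`, `fed_avg`, the candidate widths) and the toy "local update" are illustrative assumptions; sub-models are modeled simply as width-slices of one shared weight matrix.

```python
import numpy as np

FULL_WIDTH = 8          # hidden units in the full supernet layer
WIDTHS = [2, 4, 8]      # candidate sub-model widths sampled during training

rng = np.random.default_rng(0)

def init_supernet():
    """One fully connected layer: 4 inputs -> FULL_WIDTH hidden units."""
    return rng.normal(size=(4, FULL_WIDTH))

def sub_model(weights, width):
    """Weight sharing: a sub-model is simply the first `width` columns."""
    return weights[:, :width]

def local_update(weights, width, lr=0.1):
    """Toy local step: nudge only the slice used by the sampled sub-model."""
    new = weights.copy()
    grad = rng.normal(size=(4, width))   # stand-in for a real gradient
    new[:, :width] -= lr * grad
    return new

def fed_avg(client_weights):
    """FedAvg aggregation: element-wise mean of the clients' supernets."""
    return np.mean(client_weights, axis=0)

# One communication round with three clients, each sampling its own width.
server = init_supernet()
clients = [local_update(server, rng.choice(WIDTHS)) for _ in range(3)]
server = fed_avg(clients)
print(server.shape)  # the aggregated supernet keeps the full width: (4, 8)
```

In the E-FedSup variant described above, the server would broadcast only `sub_model(server, width)` to each client rather than the full supernet, which is what cuts communication cost; the aggregation idea stays the same.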

Related research:

- HeteroFL: Computation and Communication Efficient Federated Learning for Heterogeneous Clients (10/03/2020)
- Tackling Computational Heterogeneity in FL: A Few Theoretical Insights (07/12/2023)
- Bitwidth Heterogeneous Federated Learning with Progressive Weight Dequantization (02/23/2022)
- FedDAT: An Approach for Foundation Model Finetuning in Multi-Modal Heterogeneous Federated Learning (08/21/2023)
- Graph Federated Learning for CIoT Devices in Smart Home Applications (12/29/2022)
- Collaborative Deep Learning Across Multiple Data Centers (10/16/2018)
- Benchmarking FedAvg and FedCurv for Image Classification Tasks (03/31/2023)
