Resource-aware Federated Learning using Knowledge Extraction and Multi-model Fusion

08/16/2022
by Sixing Yu, et al.

With increasing concern about user data privacy, federated learning (FL) has been developed as a training paradigm for learning machine learning models on edge devices without access to sensitive data. Traditional FL and related methods require every edge device to train the same model and directly aggregate the model weights on a cloud server. Although these methods protect data privacy, they cannot handle model heterogeneity, ignore the heterogeneous computing power of edge devices, and incur steep communication costs. In this paper, we propose a resource-aware FL method that, instead of aggregating the weights of each local model, aggregates an ensemble of local knowledge extracted from the edge models and distills it into robust global knowledge that serves as the server model through knowledge distillation. On each client, the local model and the global knowledge are distilled into a tiny knowledge network by deep mutual learning. Such knowledge extraction allows each edge client to deploy a resource-aware model and to perform multi-model knowledge fusion while maintaining communication efficiency and model heterogeneity. Empirical results show that our approach significantly improves over existing FL algorithms in communication cost and generalization performance on heterogeneous data and models. Our approach reduces the communication cost of VGG-11 by up to 102× and of ResNet-32 by up to 30× when ResNet-20 is used as the knowledge network.
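
The two-phase procedure described in the abstract (client-side deep mutual learning into a tiny knowledge network, then server-side fusion of the uploaded knowledge networks via knowledge distillation) can be illustrated with a minimal PyTorch sketch. This is not the authors' released code: the function names, optimizer settings, temperature value, and the use of a server-side proxy dataset for fusion are illustrative assumptions, not details confirmed by the abstract.

```python
# Minimal sketch of one FL round under the assumptions stated above.
# Phase 1 (client): deep mutual learning between a heterogeneous local model and a
# tiny knowledge network (e.g., a ResNet-20), so only the small network is uploaded.
# Phase 2 (server): uploaded knowledge networks are fused into a global knowledge
# network by ensemble knowledge distillation on a proxy dataset (assumed available).
import copy
import torch
import torch.nn.functional as F

T = 3.0  # distillation temperature (assumed value)

def kd_loss(student_logits, teacher_logits, temp=T):
    """KL divergence between temperature-softened student and teacher predictions."""
    return F.kl_div(
        F.log_softmax(student_logits / temp, dim=1),
        F.softmax(teacher_logits / temp, dim=1),
        reduction="batchmean",
    ) * (temp * temp)

def client_update(local_model, knowledge_net, loader, device, epochs=1, lr=1e-2):
    """Deep mutual learning: local model and tiny knowledge net teach each other,
    each combining cross-entropy on labels with a KD term toward its peer."""
    local_model.to(device).train()
    knowledge_net.to(device).train()
    opt_local = torch.optim.SGD(local_model.parameters(), lr=lr, momentum=0.9)
    opt_kn = torch.optim.SGD(knowledge_net.parameters(), lr=lr, momentum=0.9)
    for _ in range(epochs):
        for x, y in loader:
            x, y = x.to(device), y.to(device)
            logits_local = local_model(x)
            logits_kn = knowledge_net(x)
            loss_local = F.cross_entropy(logits_local, y) + kd_loss(logits_local, logits_kn.detach())
            loss_kn = F.cross_entropy(logits_kn, y) + kd_loss(logits_kn, logits_local.detach())
            opt_local.zero_grad(); loss_local.backward(); opt_local.step()
            opt_kn.zero_grad(); loss_kn.backward(); opt_kn.step()
    return knowledge_net  # only this tiny network is communicated to the server

def server_fuse(client_knowledge_nets, global_knowledge_net, proxy_loader, device,
                epochs=1, lr=1e-3):
    """Ensemble distillation: the global knowledge network learns to match the
    averaged soft predictions of the client knowledge networks."""
    teachers = [copy.deepcopy(m).to(device).eval() for m in client_knowledge_nets]
    global_knowledge_net.to(device).train()
    opt = torch.optim.Adam(global_knowledge_net.parameters(), lr=lr)
    for _ in range(epochs):
        for x, _ in proxy_loader:
            x = x.to(device)
            with torch.no_grad():
                teacher_logits = torch.stack([t(x) for t in teachers]).mean(dim=0)
            loss = kd_loss(global_knowledge_net(x), teacher_logits)
            opt.zero_grad(); loss.backward(); opt.step()
    return global_knowledge_net  # broadcast back to clients for the next round
```

Because only the small knowledge networks travel between clients and server, each client can keep an arbitrarily large or architecturally different local model, which is the source of the communication savings and model heterogeneity claimed in the abstract.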

Related research:

08/25/2023
Resource-Efficient Federated Learning for Heterogenous and Resource-Constrained Environments
Federated Learning (FL) is a privacy-enforcing sub-domain of machine lea...

06/12/2020
Ensemble Distillation for Robust Model Fusion in Federated Learning
Federated Learning (FL) is a machine learning setting where many devices...

07/28/2020
Group Knowledge Transfer: Collaborative Training of Large CNNs on the Edge
Scaling up the convolutional neural network (CNN) size (e.g., width, dep...

08/15/2023
FedCache: A Knowledge Cache-driven Federated Learning Architecture for Personalized Edge Intelligence
Edge Intelligence (EI) allows Artificial Intelligence (AI) applications ...

11/08/2022
Federated Learning Using Three-Operator ADMM
Federated learning (FL) has emerged as an instance of distributed machin...

12/05/2022
HierarchyFL: Heterogeneous Federated Learning via Hierarchical Self-Distillation
Federated learning (FL) has been recognized as a privacy-preserving dist...

11/14/2022
FedCL: Federated Multi-Phase Curriculum Learning to Synchronously Correlate User Heterogeneity
Federated Learning (FL) is a new decentralized learning used for trainin...
