Distillation-Based Semi-Supervised Federated Learning for Communication-Efficient Collaborative Training with Non-IID Private Data

08/14/2020
by Sohei Itahara, et al.

This study develops a federated learning (FL) framework that overcomes the communication cost of typical frameworks, which grows sharply with model size, without compromising model performance. To this end, based on the idea of leveraging an unlabeled open dataset, we propose a distillation-based semi-supervised FL (DS-FL) algorithm that exchanges the outputs of local models among mobile devices instead of the model-parameter exchange employed by typical frameworks. In DS-FL, the communication cost depends only on the output dimensions of the models and does not scale with model size. The exchanged model outputs are used to label each sample of the open dataset, creating an additional labeled dataset. Local models are further trained on this new dataset, and model performance is enhanced owing to the data-augmentation effect. We further highlight that in DS-FL, the heterogeneity of the devices' datasets makes the aggregated label of each open-dataset sample ambiguous, slowing training convergence. To prevent this, we propose entropy reduction averaging, in which the aggregated model outputs are intentionally sharpened. Moreover, extensive experiments show that DS-FL reduces communication costs by up to 99% relative to those of the FL benchmark while achieving similar or higher classification accuracy.
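To make the aggregation-and-sharpening step concrete, below is a minimal Python sketch of how the exchanged model outputs could be combined. It is an illustration under stated assumptions, not the paper's implementation: the function name era_aggregate, the power-and-renormalize sharpening rule, and the temperature value are hypothetical stand-ins for the entropy reduction averaging described above.

```python
import numpy as np

def era_aggregate(client_probs: np.ndarray, temperature: float = 0.1) -> np.ndarray:
    """Aggregate per-client soft labels and sharpen them (hypothetical sketch).

    client_probs: shape (num_clients, num_open_samples, num_classes),
        each client's softmax outputs on the shared unlabeled open dataset.
    temperature: values below 1 reduce the entropy of the averaged
        distribution; the concrete rule and value here are assumptions,
        not the paper's exact formula.
    """
    # Plain averaging of the exchanged outputs: this per-sample vector is
    # the only payload communicated, so the cost scales with num_classes
    # rather than with the number of model parameters.
    mean_probs = client_probs.mean(axis=0)            # (samples, classes)

    # Sharpening: raising to the power 1/T and renormalizing lowers the
    # entropy of the aggregated labels, countering the ambiguity caused
    # by non-IID local datasets.
    sharpened = mean_probs ** (1.0 / temperature)
    return sharpened / sharpened.sum(axis=1, keepdims=True)

# Usage: each device then treats the open dataset paired with
# era_aggregate(outputs) as an additional labeled dataset and continues
# local training on it.
```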

Related research

11/28/2018
Communication-Efficient On-Device Machine Learning: Federated Distillation and Augmentation under Non-IID Private Data
On-device machine learning (ML) enables the training process to exploit ...

07/29/2023
Efficient Semi-Supervised Federated Learning for Heterogeneous Participants
Federated Learning (FL) has emerged to allow multiple clients to collabo...

01/17/2023
FedCliP: Federated Learning with Client Pruning
The prevalent communication efficient federated learning (FL) frameworks...

08/08/2023
ConDistFL: Conditional Distillation for Federated Learning from Partially Annotated Data
Developing a generalized segmentation model capable of simultaneously de...

11/17/2021
FLSys: Toward an Open Ecosystem for Federated Learning Mobile Apps
This paper presents the design, implementation, and evaluation of FLSys,...

04/21/2020
Lottery Hypothesis based Unsupervised Pre-training for Model Compression in Federated Learning
Federated learning (FL) enables a neural network (NN) to be trained usin...

02/22/2023
Efficient Training of Large-scale Industrial Fault Diagnostic Models through Federated Opportunistic Block Dropout
Artificial intelligence (AI)-empowered industrial fault diagnostics is i...
