SemiFL: Communication Efficient Semi-Supervised Federated Learning with Unlabeled Clients

06/02/2021
by   Enmao Diao, et al.

Federated Learning (FL) allows machine learning models to be trained using the computation and private data resources of a large number of distributed clients, such as smartphones and IoT devices. Most existing works on FL assume that clients hold ground-truth labels. However, in many practical scenarios, clients may be unable to label task-specific data, e.g., due to a lack of expertise. In this work, we consider a server that hosts a labeled dataset and wishes to leverage clients with unlabeled data for supervised learning. We propose a new Federated Learning framework, referred to as SemiFL, to address this problem of Semi-Supervised Federated Learning (SSFL). In SemiFL, clients have completely unlabeled data, while the server has a small amount of labeled data. SemiFL is communication efficient because it separates the training on server-side labeled data from the training on client-side unlabeled data. We demonstrate several strategies that enhance SemiFL's learning performance. Extensive empirical evaluations show that our communication efficient method can significantly improve the performance of a labeled server with unlabeled clients. Moreover, SemiFL can outperform many existing FL methods trained with fully supervised data, and performs competitively with state-of-the-art centralized Semi-Supervised Learning (SSL) methods. For instance, in standard communication efficient scenarios, our method achieves 93% accuracy on CIFAR10 with only 4000 labeled samples at the server. This accuracy is only 2% away from the result trained with 50000 fully labeled samples, and improves about 30% over existing SSFL methods in the communication efficient setting.
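The framework described above alternates between supervised training on the server's small labeled set and client-side training on unlabeled data. Below is a minimal sketch of one such round, assuming a toy logistic-regression model, hard pseudo-labels generated by the current global model, and FedAvg-style size-weighted aggregation; these are illustrative choices for exposition, not the paper's exact algorithm.

```python
# Sketch of a SemiFL-style round (illustrative assumptions, not the paper's
# exact method): the server fine-tunes on its labeled data, clients train on
# their own unlabeled data using pseudo-labels from the global model, and the
# server averages the client models weighted by local dataset size.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sgd_step(w, X, y, lr=0.1):
    # One gradient step of logistic regression on (X, y).
    grad = X.T @ (sigmoid(X @ w) - y) / len(y)
    return w - lr * grad

def semifl_round(w_global, server_X, server_y, client_Xs, local_steps=5):
    # 1) Server-side supervised fine-tuning on the labeled dataset.
    for _ in range(local_steps):
        w_global = sgd_step(w_global, server_X, server_y)
    # 2) Each client trains locally on pseudo-labeled (unlabeled) data.
    client_ws = []
    for X in client_Xs:
        pseudo_y = (sigmoid(X @ w_global) > 0.5).astype(float)  # hard pseudo-labels
        w = w_global.copy()
        for _ in range(local_steps):
            w = sgd_step(w, X, pseudo_y)
        client_ws.append(w)
    # 3) Server aggregates client models, weighted by local dataset size.
    sizes = np.array([len(X) for X in client_Xs], dtype=float)
    return np.average(client_ws, axis=0, weights=sizes)

# Toy data: 2-D points labeled by the sign of their first coordinate.
# Only the server holds labels; clients hold raw features.
server_X = rng.normal(size=(50, 2))
server_y = (server_X[:, 0] > 0).astype(float)
client_Xs = [rng.normal(size=(200, 2)) for _ in range(3)]

w = np.zeros(2)
for _ in range(20):
    w = semifl_round(w, server_X, server_y, client_Xs)

test_X = rng.normal(size=(1000, 2))
acc = np.mean((sigmoid(test_X @ w) > 0.5) == (test_X[:, 0] > 0))
print(f"test accuracy: {acc:.2f}")
```

Because only one model travels between server and clients per round, the communication cost matches plain FedAvg; the semi-supervised component runs entirely inside the local updates.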



Related research

10/15/2021
FedSEAL: Semi-Supervised Federated Learning with Self-Ensemble Learning and Negative Learning
Federated learning (FL), a popular decentralized and privacy-preserving ...

02/23/2023
FedIL: Federated Incremental Learning from Decentralized Unlabeled Data with Convergence Analysis
Most existing federated learning methods assume that clients have fully ...

07/29/2023
Efficient Semi-Supervised Federated Learning for Heterogeneous Participants
Federated Learning (FL) has emerged to allow multiple clients to collabo...

05/06/2023
Exploring One-shot Semi-supervised Federated Learning with A Pre-trained Diffusion Model
Federated learning is a privacy-preserving collaborative learning approa...

05/27/2022
Federated Semi-Supervised Learning with Prototypical Networks
With the increasing computing power of edge devices, Federated Learning ...

10/12/2022
FedProp: Cross-client Label Propagation for Federated Semi-supervised Learning
Federated learning (FL) allows multiple clients to jointly train a machi...

12/08/2020
RC-SSFL: Towards Robust and Communication-efficient Semi-supervised Federated Learning System
Federated Learning (FL) is an emerging decentralized artificial intellig...
