MP-FedCL: Multi-Prototype Federated Contrastive Learning for Edge Intelligence

04/01/2023
by Yu Qiao, et al.

Federated learning-assisted edge intelligence enables privacy protection in modern intelligent services. However, non-Independent and Identically Distributed (non-IID) data across edge clients can degrade local model performance. Existing single-prototype strategies represent each class by the mean of its feature embeddings; however, feature spaces are usually not tightly clustered, and a single prototype may not represent a class well. Motivated by this, this paper proposes a multi-prototype federated contrastive learning approach (MP-FedCL) that demonstrates the effectiveness of a multi-prototype strategy over a single-prototype one under non-IID settings, covering both label and feature skew. Specifically, a k-means-based multi-prototype computation strategy is first proposed to capture the different embedding representations within each class, using multiple prototypes (k centroids) to represent a class in the embedding space. In each global round, the computed prototypes and their respective model parameters are sent to the edge server and aggregated into a global prototype pool, which is then sent back to all clients to guide their local training. Finally, each client minimizes its own supervised learning loss while learning from the shared prototypes in the global prototype pool through supervised contrastive learning, which encourages it to absorb knowledge relevant to its own classes from other clients and to reduce the absorption of unrelated knowledge in each global iteration. Experimental results on MNIST, Digit-5, Office-10, and DomainNet show that our method outperforms multiple baselines, with average test accuracy improvements of about 4.6% and 10.4% under feature and label non-IID distributions, respectively.
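The per-class multi-prototype computation described in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the paper's code: it runs plain k-means on each class's feature embeddings and keeps the k centroids as that class's prototypes. All function and variable names (`kmeans`, `multi_prototypes`) are hypothetical.

```python
import numpy as np

def kmeans(features, k, iters=20, seed=0):
    """Plain k-means on an (n, d) feature matrix; returns (k, d) centroids."""
    rng = np.random.default_rng(seed)
    centroids = features[rng.choice(len(features), size=k, replace=False)]
    for _ in range(iters):
        # Assign each embedding to its nearest centroid.
        dists = np.linalg.norm(features[:, None, :] - centroids[None, :, :], axis=2)
        assign = dists.argmin(axis=1)
        # Recompute each centroid; keep the old one if its cluster is empty.
        for j in range(k):
            members = features[assign == j]
            if len(members):
                centroids[j] = members.mean(axis=0)
    return centroids

def multi_prototypes(embeddings, labels, k=3):
    """Map each class label to its k prototype vectors (k-means centroids)."""
    return {c: kmeans(embeddings[labels == c], k) for c in np.unique(labels)}

# Toy usage: 2-D embeddings for two synthetic classes.
emb = np.vstack([np.random.default_rng(1).normal(0.0, 1.0, (50, 2)),
                 np.random.default_rng(2).normal(5.0, 1.0, (50, 2))])
lab = np.array([0] * 50 + [1] * 50)
protos = multi_prototypes(emb, lab, k=3)
print(protos[0].shape)  # (3, 2): three prototypes per class
```

In the full method, each client would upload these per-class centroids to the edge server along with its model parameters, and the server would merge them into the global prototype pool used as contrastive targets.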


