Divergence-aware Federated Self-Supervised Learning

04/09/2022
by Weiming Zhuang, et al.

Self-supervised learning (SSL) is capable of learning remarkable representations from centrally available data. Recent works further combine federated learning with SSL to learn from rapidly growing decentralized unlabeled images (e.g., from cameras and phones), a setting that often results from privacy constraints. Extensive attention has been paid to SSL approaches based on Siamese networks. However, such efforts have not yet revealed deep insights into the various fundamental building blocks of the federated self-supervised learning (FedSSL) architecture. We aim to fill this gap through an in-depth empirical study and propose a new method to tackle the non-independently and identically distributed (non-IID) data problem of decentralized data. First, we introduce a generalized FedSSL framework that embraces existing SSL methods based on Siamese networks and offers the flexibility to accommodate future methods. In this framework, a server coordinates multiple clients to conduct SSL training and periodically updates the clients' local models with the aggregated global model. Using the framework, our study uncovers unique insights into FedSSL: 1) the stop-gradient operation, previously reported to be essential, is not always necessary in FedSSL; 2) retaining the local knowledge of clients is particularly beneficial for non-IID data. Inspired by these insights, we then propose a new approach for model update, Federated Divergence-aware Exponential Moving Average update (FedEMA). FedEMA adaptively updates the local models of clients using an EMA of the global model, where the decay rate is dynamically measured by model divergence. Extensive experiments demonstrate that FedEMA outperforms existing methods by 3-4%. We hope this work will provide useful insights for future research.
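
To make the divergence-aware update concrete, the sketch below shows how a client might merge the aggregated global model into its local model with an EMA whose decay rate scales with the measured divergence between the two models. This is a minimal sketch, assuming a PyTorch setting; the helper names, the `autoscaler_tau` coefficient, and the exact scaling rule are illustrative assumptions rather than the authors' reference implementation.

```python
import copy

import torch


def model_divergence(global_model, local_model):
    """L2 distance between the flattened parameters of two models."""
    diffs = [
        (g - l).flatten()
        for g, l in zip(global_model.parameters(), local_model.parameters())
    ]
    return torch.cat(diffs).norm().item()


def fedema_update(global_model, local_model, autoscaler_tau=0.7):
    """Divergence-aware EMA merge of the global model into a client's local model.

    The decay rate mu grows with the divergence between global and local
    weights (capped at 1): larger divergence keeps more local knowledge,
    smaller divergence moves the client closer to the global model. Buffers
    (e.g., BatchNorm statistics) are omitted for brevity. `autoscaler_tau`
    is an assumed scaling coefficient, not a value taken from the paper.
    """
    divergence = model_divergence(global_model, local_model)
    mu = min(autoscaler_tau * divergence, 1.0)  # dynamic decay rate

    merged = copy.deepcopy(local_model)
    with torch.no_grad():
        for p_merged, p_local, p_global in zip(
            merged.parameters(), local_model.parameters(), global_model.parameters()
        ):
            # EMA: keep a mu-fraction of local knowledge, take the rest from global.
            p_merged.copy_(mu * p_local + (1.0 - mu) * p_global)
    return merged
```

Under this sketch, a client would call `fedema_update(global_model, local_model)` at the start of each round instead of overwriting its local weights with the aggregated global weights outright, so that rounds with larger divergence retain more local knowledge.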

Related research

09/29/2021 · Federated Self-Supervised Contrastive Learning via Ensemble Similarity Distillation
This paper investigates the feasibility of learning good representation ...

05/25/2022 · Federated Self-supervised Learning for Heterogeneous Clients
Federated Learning has become an important learning paradigm due to its ...

11/14/2022 · Feature Correlation-guided Knowledge Transfer for Federated Self-supervised Learning
To eliminate the requirement of fully-labeled data for supervised model ...

11/24/2022 · Knowledge-Aware Federated Active Learning with Non-IID Data
Federated learning enables multiple decentralized clients to learn colla...

08/14/2021 · Collaborative Unsupervised Visual Representation Learning from Decentralized Data
Unsupervised representation learning has achieved outstanding performanc...

12/15/2021 · LoSAC: An Efficient Local Stochastic Average Control Method for Federated Optimization
Federated optimization (FedOpt), which targets at collaboratively traini...

06/17/2022 · Avoid Overfitting User Specific Information in Federated Keyword Spotting
Keyword spotting (KWS) aims to discriminate a specific wake-up word from...
