Does Decentralized Learning with Non-IID Unlabeled Data Benefit from Self Supervision?

10/20/2022
by Lirui Wang, et al.

Decentralized learning has been advocated and widely deployed to make efficient use of distributed datasets, with an extensive focus on supervised learning (SL) problems. Unfortunately, the majority of real-world data are unlabeled and can be highly heterogeneous across sources. In this work, we carefully study decentralized learning with unlabeled data through the lens of self-supervised learning (SSL), specifically contrastive visual representation learning. We study the effectiveness of a range of contrastive learning algorithms under decentralized learning settings, on relatively large-scale datasets including ImageNet-100, MS-COCO, and a new real-world robotic warehouse dataset. Our experiments show that decentralized SSL (Dec-SSL) is robust to the heterogeneity of decentralized datasets and learns useful representations for object classification, detection, and segmentation tasks. This robustness makes it possible to significantly reduce communication and the participation ratio of data sources, with only minimal drops in performance. Interestingly, given the same amount of data, the representations learned by Dec-SSL not only perform on par with those learned by centralized SSL, which requires communication and excessive data storage, but also sometimes outperform representations extracted by decentralized SL, which requires extra knowledge of the data labels. Finally, we provide theoretical insights into why data heterogeneity is less of a concern for Dec-SSL objectives, and introduce feature-alignment and clustering techniques to develop a new Dec-SSL algorithm that further improves performance in the face of highly non-IID data. Our study presents positive evidence for embracing unlabeled data in decentralized learning, and we hope it provides new insights into whether and why decentralized SSL is effective.
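The abstract does not spell out the training loop, but the generic decentralized-SSL pattern it describes can be sketched as follows: each client runs a self-supervised update on its own unlabeled (non-IID) data, and a FedAvg-style round averages the resulting weights. This is a minimal illustration, not the paper's algorithm: the encoder is linear, the "SSL objective" is only the view-alignment term between two augmented views (a full contrastive loss such as NT-Xent would also repel negatives to prevent collapse), and all names (`local_ssl_update`, `dec_ssl_round`, the toy data) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_ssl_update(w, views, lr=0.05, steps=10):
    """One client's local epoch on unlabeled data (illustrative).

    Minimizes only the view-alignment term ||f(x1) - f(x2)||^2 for a
    linear encoder f(x) = x @ w; a real contrastive loss would add a
    repulsion term between negatives to prevent representational collapse."""
    x1, x2 = views
    n = x1.shape[0]
    d = x1 - x2                             # loop-invariant view difference
    for _ in range(steps):
        grad = (2.0 / n) * d.T @ (d @ w)    # gradient of the alignment term
        w = w - lr * grad
    return w

def dec_ssl_round(w_global, client_views):
    """One decentralized round: broadcast the global weights, let each
    client update locally on its own unlabeled data, then average the
    local weights (FedAvg-style aggregation)."""
    local_ws = [local_ssl_update(w_global.copy(), v) for v in client_views]
    return np.mean(local_ws, axis=0)

# Non-IID toy data: each client draws inputs from a shifted distribution,
# and each sample comes with a second "augmented" view.
d_in, d_out, n = 8, 4, 32
clients = []
for c in range(4):
    base = rng.normal(loc=float(c), size=(n, d_in))  # client-specific shift
    aug = base + 0.1 * rng.normal(size=(n, d_in))    # stand-in augmentation
    clients.append((base, aug))

def align_loss(wm):
    """Average view-alignment loss over all clients."""
    return float(np.mean([np.mean(((x1 - x2) @ wm) ** 2)
                          for x1, x2 in clients]))

w0 = rng.normal(size=(d_in, d_out))
w = w0
for _ in range(3):
    w = dec_ssl_round(w, clients)
```

The averaging step in `dec_ssl_round` is exactly where non-IID data usually hurts decentralized SL; the paper's finding is that SSL objectives of this flavor degrade far less under such heterogeneity.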


