A Theory-Driven Self-Labeling Refinement Method for Contrastive Representation Learning

06/28/2021
by Pan Zhou, et al.

For an image query, unsupervised contrastive learning labels crops of the same image as positives and crops of other images as negatives. Although intuitive, such a naïve label assignment strategy cannot reveal the underlying semantic similarity between a query and its positives and negatives, and it impairs performance, since some negatives are semantically similar to the query or even share the same semantic class as the query. In this work, we first prove that for contrastive learning, inaccurate label assignment heavily impairs its generalization for semantic instance discrimination, while accurate labels benefit its generalization. Inspired by this theory, we propose a novel self-labeling refinement approach for contrastive learning. It improves label quality via two complementary modules: (i) a self-labeling refinery (SLR) to generate accurate labels, and (ii) momentum mixup (MM) to enhance the similarity between a query and its positive. SLR uses the positive of a query to estimate the semantic similarity between the query and its positive and negatives, and it combines the estimated similarity with the vanilla label assignment in contrastive learning to iteratively generate more accurate and informative soft labels. We theoretically show that SLR can exactly recover the true semantic labels of label-corrupted data and can supervise networks to achieve zero prediction error on classification tasks. MM randomly combines queries and positives to increase the semantic similarity between the generated virtual queries and their positives, which further improves label accuracy. Experimental results on CIFAR-10, ImageNet, VOC, and COCO demonstrate the effectiveness of our method. PyTorch code and the pretrained model will be released online.
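To make the two modules concrete, here is a minimal NumPy sketch of the ideas the abstract describes. It is an illustration under stated assumptions, not the paper's implementation: the function names, the blending weight `beta`, the temperature `tau`, and the Beta-distribution sampling for the mixing coefficient are all hypothetical placeholders; the paper's exact schedules and estimators may differ.

```python
import numpy as np

def softmax(x, tau=0.1):
    # Temperature-scaled softmax over the last axis.
    z = x / tau
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def refine_labels(sim_pos_to_all, one_hot, beta=0.5, tau=0.1):
    """Self-labeling refinery (sketch): turn the positive's similarity
    to every instance in the batch/dictionary into a soft distribution
    and blend it with the vanilla one-hot contrastive label.
    `beta` and `tau` are illustrative hyperparameters, not the paper's."""
    soft = softmax(sim_pos_to_all, tau)
    return (1.0 - beta) * one_hot + beta * soft

def momentum_mixup(query, positive, alpha=1.0, rng=None):
    """Momentum mixup (sketch): a convex combination of a query and its
    positive yields a virtual query that is more similar to the positive,
    so the label assigned to it is more accurate."""
    rng = rng or np.random.default_rng(0)
    lam = rng.beta(alpha, alpha)  # mixing coefficient in (0, 1)
    return lam * query + (1.0 - lam) * positive, lam
```

Because both outputs of `refine_labels` are probability distributions, their convex combination is again a valid soft label, which is what lets the refined labels drive a standard cross-entropy-style contrastive objective.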


