Semantic-aware Representation Learning via Probability Contrastive Loss

11/11/2021
by Junjie Li, et al.

Recent feature contrastive learning (FCL) has shown promising performance in unsupervised representation learning. For close-set representation learning, however, where labeled and unlabeled data share the same semantic space, FCL yields only modest gains because it does not involve the class semantics during optimization. As a result, the features it produces, although information-rich, are not guaranteed to be easily classified by the class weights learned from the labeled data. To tackle this issue, we propose a novel probability contrastive learning (PCL) method, which not only produces rich features but also enforces them to be distributed around the class prototypes. Specifically, we perform contrastive learning on the output probabilities after softmax instead of on the extracted features used in FCL, which naturally exploits the class semantics during optimization. Moreover, we remove the ℓ_2 normalization used in traditional FCL and directly use the ℓ_1-normalized probabilities for contrastive learning. Our proposed PCL is simple and effective. We conduct extensive experiments on three close-set image classification tasks, i.e., unsupervised domain adaptation, semi-supervised learning, and semi-supervised domain adaptation. The results on multiple datasets demonstrate that PCL consistently delivers considerable gains and achieves state-of-the-art performance on all three tasks.
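The core change described in the abstract is easy to sketch: compute a standard InfoNCE-style contrastive loss on softmax probabilities rather than on ℓ_2-normalized features. Below is a minimal PyTorch sketch under a SimCLR-style two-view setup; the function name `pcl_loss`, the temperature value, and the two-view batch layout are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of probability contrastive learning (PCL), assuming two
# augmented views per image and InfoNCE pairing. Not the authors' code.
import torch
import torch.nn.functional as F

def pcl_loss(logits_a, logits_b, temperature=0.1):
    """Contrastive loss on softmax probabilities instead of l2-normalized
    features.

    logits_a, logits_b: (N, C) classifier outputs for two views of the same
    N images. Softmax outputs sum to 1, so they are l1-normalized by
    construction and no l2 normalization is applied.
    """
    p_a = F.softmax(logits_a, dim=1)
    p_b = F.softmax(logits_b, dim=1)

    probs = torch.cat([p_a, p_b], dim=0)      # (2N, C)
    sim = probs @ probs.t() / temperature     # pairwise similarities (2N, 2N)
    n = logits_a.size(0)

    # Exclude self-similarity on the diagonal.
    mask = torch.eye(2 * n, dtype=torch.bool, device=sim.device)
    sim = sim.masked_fill(mask, float('-inf'))

    # The positive for sample i is its other augmented view.
    targets = torch.cat([torch.arange(n, 2 * n),
                         torch.arange(0, n)]).to(sim.device)
    return F.cross_entropy(sim, targets)
```

Because the softmax outputs already sum to one, dropping the ℓ_2 normalization of traditional FCL leaves them ℓ_1-normalized; the similarity p_a · p_b is then large only when both views concentrate their probability mass on the same class, which is what pushes the representations toward the class prototypes.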

Related research

04/27/2021 · Semi-Supervised Semantic Segmentation with Pixel-Level Contrastive Learning from a Class-wise Memory Bank
This work presents a novel approach for semi-supervised semantic segment...

10/06/2021 · ActiveMatch: End-to-end Semi-supervised Active Representation Learning
Semi-supervised learning (SSL) is an efficient framework that can train ...

06/05/2022 · Semi-Supervised Learning for Mars Imagery Classification and Segmentation
With the progress of Mars exploration, numerous Mars image data are coll...

11/28/2022 · Deep Semi-supervised Learning with Double-Contrast of Features and Semantics
In recent years, the field of intelligent transportation systems (ITS) h...

10/17/2020 · i-Mix: A Strategy for Regularizing Contrastive Representation Learning
Contrastive representation learning has shown to be an effective way of ...

03/19/2021 · UniMoCo: Unsupervised, Semi-Supervised and Full-Supervised Visual Representation Learning
Momentum Contrast (MoCo) achieves great success for unsupervised visual ...

04/04/2022 · Con^2DA: Simplifying Semi-supervised Domain Adaptation by Learning Consistent and Contrastive Feature Representations
In this work, we present Con^2DA, a simple framework that extends recent...
