Improving Contrastive Learning by Visualizing Feature Transformation

08/06/2021
by   Rui Zhu, et al.

Contrastive learning, which aims to minimize the distance between positive pairs while maximizing that between negative ones, has been widely and successfully applied in unsupervised feature learning, where the design of positive and negative (pos/neg) pairs is one of its keys. In this paper, we attempt to devise a feature-level data manipulation, distinct from data augmentation, to enhance generic contrastive self-supervised learning. To this end, we first design a visualization scheme for the pos/neg score distribution (the pos/neg score is the cosine similarity of a pos/neg pair), which enables us to analyze, interpret, and understand the learning process. To our knowledge, this is the first attempt of its kind. More importantly, leveraging this tool, we gain several significant observations, which inspire our novel Feature Transformation proposals, including the extrapolation of positives. This operation creates harder positives that boost learning, because hard positives force the model to become more view-invariant. In addition, we propose interpolation among negatives, which provides diversified negatives and makes the model more discriminative. This is the first attempt to address both challenges simultaneously. Experimental results show that our proposed Feature Transformation improves accuracy over the baselines by at least 6.0 and about 2.0 points, respectively. Successful transfer to downstream tasks demonstrates that our model is less task-biased. Visualization tools and code are available at https://github.com/DTennant/CL-Visualizing-Feature-Transformation .
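The two feature-level manipulations described above can be sketched in a few lines of numpy. This is a minimal illustration, not the paper's implementation: the mixing-coefficient ranges (`alpha`), the exact extrapolation formula, and the re-normalization step are assumptions in the spirit of the abstract (extrapolating a positive away from its anchor to make it harder, and mixing pairs of stored negatives to diversify them); all function names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def l2_normalize(x):
    """Project feature vectors onto the unit sphere (rows are features)."""
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

def extrapolate_positive(z_q, z_k, alpha=0.5):
    """Create a harder positive by pushing the positive key z_k
    away from the anchor z_q: hat_z = (1 + lam) * z_k - lam * z_q.
    With lam > 0 this lowers the pos score (cosine similarity to z_q),
    encouraging a more view-invariant model."""
    lam = rng.uniform(0.0, alpha)
    return l2_normalize((1.0 + lam) * z_k - lam * z_q)

def interpolate_negatives(neg_bank, alpha=1.0):
    """Diversify negatives by mixing random pairs from the negative
    bank: lam * z_i + (1 - lam) * z_j, then re-normalizing."""
    lam = rng.beta(alpha, alpha)
    perm = rng.permutation(len(neg_bank))
    return l2_normalize(lam * neg_bank + (1.0 - lam) * neg_bank[perm])

# Toy example: an anchor, a nearby positive, and a small negative bank.
z_q = l2_normalize(rng.normal(size=128))
z_k = l2_normalize(z_q + 0.1 * rng.normal(size=128))

hard_pos = extrapolate_positive(z_q, z_k)
mixed_negs = interpolate_negatives(l2_normalize(rng.normal(size=(8, 128))))

# The pos score of the extrapolated positive never exceeds the original.
print(float(z_k @ z_q), float(hard_pos @ z_q))
```

In a full training loop these transformed features would replace (or supplement) the original keys inside the InfoNCE loss; only the feature vectors change, so no extra image-space augmentation is needed.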

