What Makes for Good Views for Contrastive Learning?

05/20/2020
by Yonglong Tian, et al.

Contrastive learning between multiple views of the data has recently achieved state-of-the-art performance in the field of self-supervised representation learning. Despite its success, the influence of different view choices has been less studied. In this paper, we use empirical analysis to better understand the importance of view selection, and argue that we should reduce the mutual information (MI) between views while keeping task-relevant information intact. To verify this hypothesis, we devise unsupervised and semi-supervised frameworks that learn effective views by aiming to reduce their MI. We also consider data augmentation as a way to reduce MI, and show that increasing data augmentation indeed leads to decreasing MI and improves downstream classification accuracy. As a by-product, we also achieve a new state-of-the-art accuracy on unsupervised pre-training for ImageNet classification (73% top-1 linear readout with a ResNet-50). In addition, transferring our models to PASCAL VOC object detection and COCO instance segmentation consistently outperforms supervised pre-training. Code: http://github.com/HobbitLong/PyContrast
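The contrastive objective discussed above is commonly instantiated as the InfoNCE loss, whose value lower-bounds the mutual information between the two views (I(v1; v2) >= log N - loss for a batch of N pairs). The following is a minimal NumPy sketch of that objective, not the authors' PyContrast implementation; the function name and temperature default are illustrative assumptions.

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.07):
    """InfoNCE loss between two batches of view embeddings (shape (N, D)).

    Row i of z1 and row i of z2 are embeddings of two views of the same
    sample (the positive pair); all other rows act as negatives.
    log(N) - loss is a lower bound on the MI between the two views.
    """
    # L2-normalise so the dot product is cosine similarity.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)

    logits = z1 @ z2.T / temperature      # (N, N); positives on the diagonal
    n = len(z1)

    # Row-wise cross-entropy with the diagonal as the target class,
    # computed with a numerically stable log-softmax.
    logits = logits - logits.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(n), np.arange(n)].mean()
```

As a sanity check, identical views are trivially matched and give a near-zero loss, while independent random views give a loss near log(N), illustrating that the loss tracks how much shared (mutual) information the two views carry.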

Related research

- 08/07/2023 · Feature-Suppressed Contrast for Self-Supervised Food Pre-training: "Most previous approaches for analyzing food images have relied on extens..."
- 08/31/2021 · ScatSimCLR: self-supervised contrastive learning with pretext task regularization for small-scale datasets: "In this paper, we consider a problem of self-supervised learning for sma..."
- 05/30/2019 · Unsupervised pre-training helps to conserve views from input distribution: "We investigate the effects of the unsupervised pre-training method under..."
- 06/09/2022 · Rethinking 360° Image Visual Attention Modelling with Unsupervised Learning: "Despite the success of self-supervised representation learning on plana..."
- 06/13/2019 · Contrastive Multiview Coding: "Humans view the world through many sensory channels, e.g., the long-wave..."
- 08/05/2021 · A Low Rank Promoting Prior for Unsupervised Contrastive Learning: "Unsupervised learning is just at a tipping point where it could really t..."
- 02/09/2021 · DetCo: Unsupervised Contrastive Learning for Object Detection: "Unsupervised contrastive learning achieves great success in learning ima..."
