A Broad Study on the Transferability of Visual Representations with Contrastive Learning

03/24/2021
by   Ashraful Islam, et al.

Tremendous progress has been made in visual representation learning, notably with the recent success of self-supervised contrastive learning methods. Supervised contrastive learning has also been shown to outperform its cross-entropy counterpart by leveraging labels to choose where to contrast. However, little work has explored the transfer capability of contrastive learning to a different domain. In this paper, we conduct a comprehensive study on the transferability of the representations learned by different contrastive approaches, covering linear evaluation, full-network transfer, and few-shot recognition on 12 downstream datasets from different domains, as well as object detection on MSCOCO and VOC0712. The results show that the contrastive approaches learn representations that transfer easily to a different downstream task. We further observe that jointly optimizing a self-supervised contrastive loss with a cross-entropy or supervised-contrastive loss yields better transferability than the supervised counterparts. Our analysis reveals that representations learned with contrastive approaches contain more low- and mid-level semantics than those of cross-entropy models, which enables them to quickly adapt to a new task. Our code and models will be made publicly available to facilitate future research on the transferability of visual representations.
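The joint objective described above can be sketched as a weighted sum of a self-supervised contrastive term (here an NT-Xent/SimCLR-style loss) and a cross-entropy term. The following is a minimal NumPy illustration of that idea, not the paper's actual implementation; the function names, the temperature value, and the weighting factor `alpha` are assumptions for the sketch.

```python
import numpy as np

def nt_xent_loss(z, temperature=0.5):
    """Self-supervised contrastive (NT-Xent) loss.

    z: (2N, d) array of embeddings, where rows i and i+N are the two
    augmented views of the same image (assumed batch layout).
    """
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # cosine similarities
    sim = z @ z.T / temperature
    n2 = z.shape[0]
    np.fill_diagonal(sim, -np.inf)                    # exclude self-pairs
    n = n2 // 2
    # the positive for row i is its other augmented view
    pos = np.concatenate([np.arange(n, n2), np.arange(0, n)])
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    return float(np.mean(logsumexp - sim[np.arange(n2), pos]))

def cross_entropy_loss(logits, labels):
    """Standard softmax cross-entropy over class logits."""
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    return float(-np.mean(log_probs[np.arange(len(labels)), labels]))

def joint_loss(z, logits, labels, alpha=1.0):
    # Weighted sum of the two objectives; alpha balances the
    # contrastive term against cross-entropy (assumed weighting).
    return cross_entropy_loss(logits, labels) + alpha * nt_xent_loss(z)
```

In practice both terms would be computed on the same backbone's outputs (a projection head for `z`, a classifier head for `logits`) and minimized together during pretraining.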

Related research:

- 03/25/2021: Contrasting Contrastive Self-Supervised Representation Learning Models
- 04/06/2022: Beyond Separability: Analyzing the Linear Transferability of Contrastive Representations to Related Subpopulations
- 03/16/2022: Is it all a cluster game? – Exploring Out-of-Distribution Detection based on Clustering in the Embedding Space
- 02/17/2021: Dissecting Supervised Contrastive Learning
- 11/05/2020: Intriguing Properties of Contrastive Losses
- 12/26/2020: Spatial Contrastive Learning for Few-Shot Classification
- 02/12/2022: What Makes Good Contrastive Learning on Small-Scale Wearable-based Tasks?
