Tradeoffs Between Contrastive and Supervised Learning: An Empirical Study

12/10/2021
by Ananya Karthik, et al.

Contrastive learning has made considerable progress in computer vision, outperforming supervised pretraining on a range of downstream datasets. However, is contrastive learning the better choice in all situations? We demonstrate two cases where it is not. First, under sufficiently small pretraining budgets, supervised pretraining on ImageNet consistently outperforms a comparable contrastive model on eight diverse image classification datasets. This suggests that the common practice of comparing pretraining approaches at hundreds or thousands of epochs may not produce actionable insights for those with more limited compute budgets. Second, even with larger pretraining budgets, we identify tasks where supervised learning prevails, perhaps because the object-centric bias of supervised pretraining makes the model more resilient to common corruptions and spurious foreground-background correlations. These results underscore the need to characterize tradeoffs of different pretraining objectives across a wider range of contexts and training regimes.
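For readers unfamiliar with the two pretraining objectives being compared, the sketch below contrasts a standard supervised cross-entropy step with a SimCLR-style contrastive (NT-Xent) step. This is a minimal PyTorch illustration under assumed hyperparameters (temperature 0.5, random dummy projections); it is not the paper's exact training setup or model.

    import torch
    import torch.nn.functional as F

    def supervised_loss(logits, labels):
        # Standard supervised pretraining objective: cross-entropy on class labels.
        return F.cross_entropy(logits, labels)

    def nt_xent_loss(z1, z2, temperature=0.5):
        # SimCLR-style NT-Xent loss over two augmented views of the same batch.
        # z1, z2: (N, D) projection-head outputs for view 1 and view 2.
        z1 = F.normalize(z1, dim=1)
        z2 = F.normalize(z2, dim=1)
        z = torch.cat([z1, z2], dim=0)      # (2N, D) stacked views
        sim = z @ z.t() / temperature       # (2N, 2N) scaled cosine similarities
        sim.fill_diagonal_(float("-inf"))   # exclude self-similarity
        n = z1.size(0)
        # The positive for row i is the other augmented view of the same image.
        targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
        return F.cross_entropy(sim, targets)

    if __name__ == "__main__":
        z1, z2 = torch.randn(8, 128), torch.randn(8, 128)
        print(nt_xent_loss(z1, z2))

The key structural difference this highlights is that the supervised objective requires labels, while the contrastive objective derives its training signal from view pairs alone, which is one source of the differing inductive biases the paper studies.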


Related research

05/12/2021
When Does Contrastive Visual Representation Learning Work?
Recent self-supervised representation learning techniques have largely c...

08/21/2023
SupEuclid: Extremely Simple, High Quality OoD Detection with Supervised Contrastive Learning and Euclidean Distance
Out-of-Distribution (OoD) detection has developed substantially in the p...

12/29/2020
CMV-BERT: Contrastive multi-vocab pretraining of BERT
In this work, we present CMV-BERT, which improves the pretraining of a...

12/09/2022
Contrastive View Design Strategies to Enhance Robustness to Domain Shifts in Downstream Object Detection
Contrastive learning has emerged as a competitive pretraining method for...

05/26/2022
Learning to segment with limited annotations: Self-supervised pretraining with regression and contrastive loss in MRI
Obtaining manual annotations for large datasets for supervised training ...

12/01/2021
Revisiting the Transferability of Supervised Pretraining: an MLP Perspective
The pretrain-finetune paradigm is a classical pipeline in visual learnin...

06/11/2020
What makes instance discrimination good for transfer learning?
Unsupervised visual pretraining based on the instance discrimination pre...
