When hard negative sampling meets supervised contrastive learning

08/28/2023
by Zijun Long, et al.

State-of-the-art image models predominantly follow a two-stage strategy: pre-training on large datasets and fine-tuning with cross-entropy loss. Many studies have shown that the cross-entropy loss can result in sub-optimal generalisation and stability. While the supervised contrastive loss addresses some limitations of cross-entropy by focusing on intra-class similarities and inter-class differences, it neglects the importance of hard negative mining. We propose that models benefit from weighting negative samples according to their dissimilarity to positive counterparts. In this paper, we introduce a new supervised contrastive learning objective, SCHaNe, which incorporates hard negative sampling during the fine-tuning phase. Experimental results indicate that, without requiring specialized architectures, additional data, or extra computational resources, SCHaNe outperforms the strong baseline BEiT-3 in Top-1 accuracy across various benchmarks, with significant gains of up to 3.32% in few-shot learning settings and 3.41% in full-dataset fine-tuning. Importantly, the proposed objective sets a new state of the art for base models on ImageNet-1k, achieving 86.14% accuracy. Furthermore, we demonstrate that it yields better embeddings, explaining the improved effectiveness observed in our experiments.
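To make the weighting idea concrete, here is a minimal PyTorch sketch of a supervised contrastive loss that up-weights hard negatives, in the spirit of SCHaNe. The function name, the `beta` concentration parameter, and the exact normalisation of the weights are illustrative assumptions, not the paper's published formulation.

```python
import torch
import torch.nn.functional as F


def hard_negative_supcon_loss(features, labels, temperature=0.1, beta=1.0):
    """Sketch: supervised contrastive loss with hard negative weighting.

    features: (N, D) L2-normalised embeddings for a batch.
    labels:   (N,) integer class labels.
    beta:     assumed concentration parameter; larger values up-weight
              negatives that are more similar to the anchor (i.e. harder).
    """
    n = features.size(0)
    device = features.device

    # Pairwise similarities, temperature-scaled.
    sim = features @ features.t() / temperature      # (N, N)
    exp_sim = torch.exp(sim)

    # Positives share the anchor's label (self excluded); negatives do not.
    labels = labels.view(-1, 1)
    same = labels.eq(labels.t())
    eye = torch.eye(n, dtype=torch.bool, device=device)
    pos_mask = (same & ~eye).float()
    neg_mask = (~same).float()

    # Hard negative weights: importance proportional to exp(beta * sim),
    # normalised so the average weight per anchor is roughly one.
    with torch.no_grad():
        w = torch.exp(beta * sim) * neg_mask
        w = w / w.sum(dim=1, keepdim=True).clamp_min(1e-12)
        w = w * neg_mask.sum(dim=1, keepdim=True)

    weighted_neg = (w * exp_sim).sum(dim=1, keepdim=True)  # (N, 1)

    # Per anchor-positive pair: -log( e^{s_ip} / (e^{s_ip} + weighted negatives) ).
    denom = exp_sim * pos_mask + weighted_neg
    log_prob = sim - torch.log(denom.clamp_min(1e-12))

    # Average the log-probability over each anchor's positives.
    pos_count = pos_mask.sum(dim=1).clamp_min(1.0)
    loss = -(log_prob * pos_mask).sum(dim=1) / pos_count
    return loss.mean()


# Toy usage: 8 embeddings of dimension 16, 2 classes.
feats = F.normalize(torch.randn(8, 16), dim=1)
labels = torch.tensor([0, 0, 1, 1, 0, 1, 0, 1])
print(hard_negative_supcon_loss(feats, labels))
```

With beta = 0 the weights are uniform and the loss reduces to a standard supervised contrastive objective; increasing beta shifts the denominator's mass toward negatives that sit close to the anchor in embedding space, which is the hard negative mining behaviour the abstract describes.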


