Rebalanced Siamese Contrastive Mining for Long-Tailed Recognition

03/22/2022
by Zhisheng Zhong, et al.
Deep neural networks perform poorly on heavily class-imbalanced datasets. Given the promising performance of contrastive learning, we propose Rebalanced Siamese Contrastive Mining (ResCom) to tackle imbalanced recognition. Based on mathematical analysis and simulation results, we claim that supervised contrastive learning suffers from a dual class-imbalance problem at both the original batch and Siamese batch levels, which is more serious than in long-tailed classification learning. In this paper, at the original batch level, we introduce a class-balanced supervised contrastive loss that assigns adaptive weights to different classes. At the Siamese batch level, we present a class-balanced queue, which maintains the same number of keys for all classes. Furthermore, we note that the gradient of the contrastive loss with respect to the contrastive logits can be decoupled into positive and negative terms, and that easy positives and easy negatives make the contrastive gradient vanish. We therefore propose supervised hard positive and negative pair mining to select informative pairs for the contrastive computation and improve representation learning. Finally, to approximately maximize the mutual information between the two views, we propose Siamese Balanced Softmax and combine it with the contrastive loss for one-stage training. ResCom outperforms previous methods by large margins on multiple long-tailed recognition benchmarks. Our code will be made publicly available at: https://github.com/dvlab-research/ResCom.
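The class-balanced queue described above keeps the same number of keys per class, so tail classes are represented as often as head classes in the Siamese batch. A minimal sketch of this idea (the class `ClassBalancedQueue`, its capacity parameter, and method names are illustrative assumptions, not the paper's actual implementation) could look like:

```python
from collections import deque

class ClassBalancedQueue:
    """Hypothetical sketch of a class-balanced key queue: each class keeps
    at most `keys_per_class` of its most recent keys, so every class
    contributes equally many keys to the contrastive computation."""

    def __init__(self, num_classes, keys_per_class):
        # one bounded FIFO per class; deque(maxlen=...) evicts the oldest key
        self.queues = [deque(maxlen=keys_per_class) for _ in range(num_classes)]

    def enqueue(self, keys, labels):
        # keys: iterable of feature vectors, labels: matching class indices
        for key, y in zip(keys, labels):
            self.queues[y].append(key)

    def all_keys(self):
        # flatten to (key, label) pairs for the Siamese batch
        return [(key, y) for y, q in enumerate(self.queues) for key in q]
```

Because eviction happens per class rather than globally, a flood of head-class samples can never push tail-class keys out of the queue.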
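The mining step follows from the gradient observation: easy positives (already similar to the anchor) and easy negatives (already dissimilar) contribute near-zero gradient, so only hard pairs are kept. A minimal per-anchor sketch of such selection (the function name, `k_pos`/`k_neg` parameters, and plain cosine-similarity inputs are assumptions for illustration, not ResCom's exact procedure):

```python
import numpy as np

def mine_hard_pairs(sims, labels, anchor_label, k_pos=2, k_neg=2):
    """Sketch of hard pair mining for one anchor.

    sims:   similarity of each candidate to the anchor
    labels: class index of each candidate
    Keeps the k hardest positives (same class, LOWEST similarity) and the
    k hardest negatives (other class, HIGHEST similarity)."""
    sims = np.asarray(sims, dtype=float)
    labels = np.asarray(labels)
    pos = np.where(labels == anchor_label)[0]
    neg = np.where(labels != anchor_label)[0]
    hard_pos = pos[np.argsort(sims[pos])[:k_pos]]        # least similar positives
    hard_neg = neg[np.argsort(sims[neg])[::-1][:k_neg]]  # most similar negatives
    return hard_pos, hard_neg
```

The returned indices would then feed the contrastive loss in place of the full candidate set, concentrating gradient signal on informative pairs.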
