Contrastive Learning with Boosted Memorization

05/25/2022
by Zhihan Zhou et al.

Self-supervised learning has achieved great success in the representation learning of visual and textual data. However, current methods are mainly validated on well-curated datasets, which do not exhibit the long-tailed distributions found in real-world data. Recent attempts at self-supervised long-tailed learning rebalance from the loss perspective or the model perspective, mirroring the paradigms of supervised long-tailed learning. Nevertheless, without the aid of labels, these explorations have not delivered the expected gains, owing to limitations in tail-sample discovery or heuristic structure design. Unlike previous works, we explore this direction from an alternative perspective, i.e., the data perspective, and propose a novel Boosted Contrastive Learning (BCL) method. Specifically, BCL leverages the memorization effect of deep neural networks to automatically drive the information discrepancy of the sample views in contrastive learning, which is more effective for enhancing long-tailed learning in a label-unaware context. Extensive experiments on a range of benchmark datasets demonstrate the effectiveness of BCL over several state-of-the-art methods. Our code is available at https://github.com/Zhihan-Zhou/Boosted-Contrastive-Learning.
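The core idea, as described in the abstract, can be sketched as follows: track a per-sample memorization signal over training and use it to modulate how strongly each sample's contrastive views are augmented. This is a hypothetical minimal sketch, not the paper's exact formulation; the momentum-smoothed loss as a memorization proxy, the smoothing factor `beta`, and the linear mapping to an augmentation-strength range are all illustrative assumptions.

```python
import numpy as np

class MemorizationBoost:
    """Hypothetical sketch of memorization-driven view augmentation.

    A momentum-smoothed per-sample loss serves as a memorization proxy:
    samples the network memorizes slowly (persistently high loss, often
    tail samples) are assigned stronger augmentation, increasing the
    information discrepancy between their contrastive views.
    """

    def __init__(self, num_samples, beta=0.9, min_strength=0.2, max_strength=1.0):
        self.beta = beta                              # momentum for loss smoothing
        self.momentum_loss = np.zeros(num_samples)    # one entry per training sample
        self.min_strength = min_strength
        self.max_strength = max_strength

    def update(self, indices, losses):
        """Blend the current batch losses into the per-sample momentum loss."""
        self.momentum_loss[indices] = (
            self.beta * self.momentum_loss[indices] + (1.0 - self.beta) * losses
        )

    def augmentation_strength(self, indices):
        """Map normalized momentum loss to [min_strength, max_strength].

        Higher momentum loss (less memorized) -> stronger augmentation.
        """
        m = self.momentum_loss
        denom = m.max() - m.min()
        if denom > 0:
            norm = (m[indices] - m.min()) / denom
        else:
            norm = np.zeros(len(indices))
        return self.min_strength + norm * (self.max_strength - self.min_strength)
```

In use, `update` would be called each iteration with the batch's per-sample contrastive losses, and `augmentation_strength` would parameterize the view transformations (e.g., crop scale or color-jitter magnitude) for the next epoch.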

