
DUEL: Adaptive Duplicate Elimination on Working Memory for Self-Supervised Learning

by Won-Seok Choi et al.
Seoul National University

In self-supervised learning (SSL), frequent collisions, in which target data and their negative samples share the same class, are known to decrease performance. Such collisions occur especially often in real-world data, such as crawled data or robot-gathered observations, because of duplicates in the data. To address this problem, we claim that sampling negative samples from an adaptively debiased distribution held in memory makes the model more stable than sampling directly from a biased dataset. In this paper, we introduce a novel SSL framework with adaptive Duplicate Elimination (DUEL), inspired by human working memory. The proposed framework prevents downstream task performance from degrading under dramatic inter-class imbalance.
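The idea of a working memory that discards near-duplicate entries before serving negative samples can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the class name, the cosine-similarity duplicate test, the fixed threshold, and the FIFO eviction policy are all assumptions made for the sake of a runnable example.

```python
import random


class DuplicateEliminatingMemory:
    """Toy working-memory buffer: near-duplicate embeddings are replaced
    rather than appended, so negatives drawn from the buffer come from a
    debiased (less duplicate-heavy) distribution. Hypothetical sketch only."""

    def __init__(self, capacity: int, sim_threshold: float):
        self.capacity = capacity
        self.sim_threshold = sim_threshold
        self.memory: list[list[float]] = []

    @staticmethod
    def _cosine(a: list[float], b: list[float]) -> float:
        dot = sum(x * y for x, y in zip(a, b))
        na = sum(x * x for x in a) ** 0.5
        nb = sum(x * x for x in b) ** 0.5
        return dot / (na * nb + 1e-12)

    def insert(self, emb: list[float]) -> None:
        # Duplicate elimination: if the new embedding is too similar to an
        # existing entry, overwrite that entry instead of appending it.
        for i, old in enumerate(self.memory):
            if self._cosine(old, emb) > self.sim_threshold:
                self.memory[i] = emb
                return
        if len(self.memory) >= self.capacity:
            self.memory.pop(0)  # simple FIFO eviction when the buffer is full
        self.memory.append(emb)

    def sample_negatives(self, k: int) -> list[list[float]]:
        # Negatives are drawn from the deduplicated buffer, not the raw stream.
        return random.sample(self.memory, min(k, len(self.memory)))
```

For example, inserting the same embedding five times leaves only one entry in the buffer, so an over-represented (duplicated) class cannot dominate the negative samples the way it would if negatives were drawn from the raw data stream.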


Evaluation of Out-of-Distribution Detection Performance of Self-Supervised Learning in a Controllable Environment

We evaluate the out-of-distribution (OOD) detection performance of self-...

Self-supervised Learning is More Robust to Dataset Imbalance

Self-supervised learning (SSL) is a scalable way to learn general visual...

Improving Self-supervised Learning with Automated Unsupervised Outlier Arbitration

Our work reveals a structured shortcoming of the existing mainstream sel...

Self-supervised Tumor Segmentation through Layer Decomposition

In this paper, we propose a self-supervised approach for tumor segmentat...

Run Away From your Teacher: Understanding BYOL by a Novel Self-Supervised Approach

Recently, a newly proposed self-supervised framework Bootstrap Your Own ...

A Constant-time Adaptive Negative Sampling

Softmax classifiers with a very large number of classes naturally occur ...