Unbiased and Efficient Self-Supervised Incremental Contrastive Learning

01/28/2023
by Cheng Ji, et al.

Contrastive Learning (CL) has proven to be a powerful self-supervised approach across a wide range of domains, including computer vision and graph representation learning. However, the incremental learning setting for CL has rarely been studied, which limits its use in real-world applications. Contrastive learning discriminates each sample from negative samples drawn from a noise distribution, and this distribution changes in incremental scenarios. Consequently, fitting only the new data while ignoring the shift in the noise distribution introduces bias, whereas retraining from scratch is inefficient. To bridge this research gap, we propose a self-supervised Incremental Contrastive Learning (ICL) framework consisting of (i) a novel Incremental InfoNCE (NCE-II) loss function that estimates the change of the noise distribution for old data, guaranteeing no bias with respect to retraining, and (ii) a meta-optimization with a deep reinforced Learning Rate Learning (LRL) mechanism that adaptively learns the learning rate from the status of the training process and achieves the fast convergence critical for incremental learning. Theoretically, we show through rigorous mathematical derivation that the proposed ICL is equivalent to retraining. In practice, extensive experiments in different domains demonstrate that, without retraining a new model, ICL achieves up to 16.7x training speedup and 16.8x faster convergence with competitive results.
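For context, the standard InfoNCE objective that NCE-II builds on contrasts an anchor x and its positive x+ against K negatives drawn from a noise distribution p_n. The sketch below is the widely used InfoNCE definition, not the paper's NCE-II form; the encoder f and temperature τ are generic symbols, not the paper's notation:

\[
\mathcal{L}_{\mathrm{InfoNCE}} = -\,\mathbb{E}\!\left[\log \frac{\exp\!\big(f(x)^{\top} f(x^{+})/\tau\big)}{\exp\!\big(f(x)^{\top} f(x^{+})/\tau\big) + \sum_{i=1}^{K}\exp\!\big(f(x)^{\top} f(x_{i}^{-})/\tau\big)}\right], \qquad x_{i}^{-} \sim p_{n}.
\]

Because the denominator is a sample estimate over negatives from p_n, any shift in p_n as new data arrives changes the loss value on old data; this is the source of the bias that, per the abstract, NCE-II corrects by estimating the change of the noise distribution.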

