Model-Aware Contrastive Learning: Towards Escaping Uniformity-Tolerance Dilemma in Training

07/16/2022
by   Zizheng Huang, et al.

Instance discrimination contrastive learning (CL) has achieved significant success in learning transferable representations. A hardness-aware property related to the temperature τ of the CL loss has been identified as playing an essential role in automatically concentrating on hard negative samples. However, previous work also shows that the CL loss suffers from a uniformity-tolerance dilemma (UTD), which can lead to unexpected performance degradation: a smaller temperature helps to learn separable embeddings but is less tolerant of semantically related samples, which may result in a suboptimal embedding space, and vice versa. In this paper, we propose a Model-Aware Contrastive Learning (MACL) strategy to escape the UTD. In the under-trained phases, the high-similarity region around an anchor is unlikely to contain latent positive samples, so adopting a small temperature in these stages imposes a larger penalty on hard negative samples and improves the discrimination of the CL model. In contrast, a larger temperature in the well-trained phases helps to explore semantic structures because it is more tolerant of potential positive samples. In our implementation, the temperature in MACL adapts to the alignment property, which reflects the confidence of the CL model. Furthermore, we re-examine, from a unified gradient-reduction perspective, why contrastive learning requires a large number of negative samples. Based on MACL and these analyses, we propose a new CL loss that improves the learned representations and enables training with small batch sizes.
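
To make the adaptive-temperature idea concrete, the following is a minimal, hypothetical sketch of an InfoNCE-style loss whose temperature grows with the batch-level alignment (mean positive-pair similarity), used here as a proxy for model confidence. The function name macl_info_nce and the parameters tau_0 and alpha are illustrative assumptions, not the paper's exact parameterisation.

```python
import torch
import torch.nn.functional as F

def macl_info_nce(anchor, positive, negatives, tau_0=0.1, alpha=0.2):
    """Hypothetical sketch of an InfoNCE loss with a model-aware temperature.

    anchor, positive: (B, D) L2-normalised embeddings of positive pairs.
    negatives:        (B, K, D) L2-normalised negative embeddings.
    tau_0, alpha:     illustrative base temperature and scaling factor
                      (assumed here, not the paper's exact values).
    """
    # Alignment: mean cosine similarity of positive pairs, taken as a proxy
    # for how confident (well-trained) the model currently is.
    pos_sim = (anchor * positive).sum(dim=-1)            # (B,)
    alignment = pos_sim.mean().clamp(min=0.0).detach()   # scalar in [0, 1]

    # Model-aware temperature: small when alignment is low (under-trained,
    # strong penalty on hard negatives), larger when alignment is high
    # (well-trained, more tolerance to semantically related samples).
    tau = tau_0 + alpha * alignment

    neg_sim = torch.einsum("bd,bkd->bk", anchor, negatives)  # (B, K)
    logits = torch.cat([pos_sim.unsqueeze(1), neg_sim], dim=1) / tau
    labels = torch.zeros(anchor.size(0), dtype=torch.long, device=anchor.device)
    return F.cross_entropy(logits, labels)
```

The key design choice this sketch illustrates is that the temperature is driven by a statistic of the model's own predictions rather than by a fixed schedule, so the penalty on hard negatives relaxes automatically as training progresses.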

Related research

- 12/15/2020 · Understanding the Behaviour of Contrastive Loss
  Unsupervised contrastive learning has achieved outstanding success, whil...
- 03/30/2022 · Dual Temperature Helps Contrastive Learning Without Many Negative Samples: Towards Understanding and Simplifying MoCo
  Contrastive learning (CL) is widely known to require many negative sampl...
- 04/06/2023 · Synthetic Hard Negative Samples for Contrastive Learning
  Contrastive learning has emerged as an essential approach for self-super...
- 12/16/2022 · Hard Sample Aware Network for Contrastive Deep Graph Clustering
  Contrastive deep graph clustering, which aims to divide nodes into disjo...
- 05/19/2023 · Not All Semantics are Created Equal: Contrastive Self-supervised Learning with Automatic Temperature Individualization
  In this paper, we aim to optimize a contrastive loss with individualized...
- 09/09/2021 · Smoothed Contrastive Learning for Unsupervised Sentence Embedding
  Contrastive learning has been gradually applied to learn high-quality un...
- 04/22/2022 · Universum-inspired Supervised Contrastive Learning
  Mixup is an efficient data augmentation method which generates additiona...
