DimCL: Dimensional Contrastive Learning For Improving Self-Supervised Learning

09/21/2023
by Thanh Nguyen, et al.

Self-supervised learning (SSL) has achieved remarkable success, with contrastive learning (CL) playing a key role. However, recently developed non-CL frameworks have reached comparable or better performance and show strong potential for further improvement, prompting researchers to enhance them further. Assimilating CL into non-CL frameworks has been thought to be beneficial, but empirical evidence shows no visible improvement. In light of this, this paper proposes Dimensional Contrastive Learning (DimCL), a strategy that performs CL along the dimensional direction instead of the batch direction used in conventional contrastive learning. DimCL aims to enhance feature diversity, and it can serve as a regularizer for prior SSL frameworks. DimCL proves effective, and its hardness-aware property is identified as a critical reason for its success. Extensive experimental results reveal that assimilating DimCL into SSL frameworks improves performance by a non-trivial margin on various datasets and backbone architectures.
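The core idea can be illustrated with a minimal sketch: conventional contrastive learning applies an InfoNCE-style loss across rows of the embedding matrix (one row per sample), while a dimensional variant applies the same loss across columns (one per feature dimension). This is only an illustrative reading of the abstract, not the paper's exact loss; the function names, the temperature value, and the plain InfoNCE objective (without the hardness-aware weighting the paper discusses) are assumptions.

```python
import numpy as np

def infonce(anchors, targets, temperature=0.1):
    # Generic InfoNCE: each row of `anchors` is pulled toward the matching
    # row of `targets` and pushed away from all other rows of `targets`.
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    t = targets / np.linalg.norm(targets, axis=1, keepdims=True)
    logits = a @ t.T / temperature                    # pairwise cosine similarities
    logits -= logits.max(axis=1, keepdims=True)       # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))               # -log p(positive pair)

def batch_contrastive_loss(z1, z2):
    # Conventional CL: contrast along the batch direction.
    # z1, z2 are (batch, dim) embeddings of two views; rows are samples.
    return infonce(z1, z2)

def dimensional_contrastive_loss(z1, z2):
    # DimCL-style sketch: transpose so rows are feature dimensions,
    # then contrast along the dimensional direction instead.
    return infonce(z1.T, z2.T)
```

Transposing before the loss means each feature dimension's activation pattern over the batch is treated as the unit being contrasted, which is one way the "dimensional direction" described above can be realized.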


Related research

- Understanding and Improving the Role of Projection Head in Self-Supervised Learning (12/22/2022)
  Self-supervised learning (SSL) aims to produce useful feature representa...

- Understand and Improve Contrastive Learning Methods for Visual Representation: A Review (06/06/2021)
  Traditional supervised learning methods are hitting a bottleneck because...

- Understanding Contrastive Learning Through the Lens of Margins (06/20/2023)
  Self-supervised learning, or SSL, holds the key to expanding the usage o...

- Improving Domain-Invariance in Self-Supervised Learning via Batch Styles Standardization (03/10/2023)
  The recent rise of Self-Supervised Learning (SSL) as one of the preferre...

- Latent Augmentation For Better Graph Self-Supervised Learning (06/26/2022)
  Graph self-supervised learning has been vastly employed to learn represe...

- On the duality between contrastive and non-contrastive self-supervised learning (06/03/2022)
  Recent approaches in self-supervised learning of image representations c...

- Dual Temperature Helps Contrastive Learning Without Many Negative Samples: Towards Understanding and Simplifying MoCo (03/30/2022)
  Contrastive learning (CL) is widely known to require many negative sampl...
