Improving Dense Contrastive Learning with Dense Negative Pairs

10/11/2022
by Berk Iskender, et al.

Many contrastive representation learning methods learn a single global representation of an entire image. However, dense contrastive representation learning methods such as DenseCL [19] can learn better representations for tasks requiring stronger spatial localization of features, such as multi-label classification, detection, and segmentation. In this work, we study how to improve the quality of the representations learned by DenseCL by modifying the training scheme and objective function, and propose DenseCL++. We also conduct several ablation studies to better understand the effects of: (i) various techniques to form dense negative pairs among augmentations of different images, (ii) cross-view dense negative and positive pairs, and (iii) an auxiliary reconstruction task. Our results show a 3.5% improvement over SimCLR [3] and DenseCL in COCO multi-label classification. In the COCO and VOC segmentation tasks, we achieve a 1.8% improvement.
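For intuition, below is a minimal, self-contained PyTorch sketch of a dense InfoNCE-style loss in which each spatial location in one augmented view treats the corresponding location in the other view as its positive, while dense features from other images in the batch serve as dense negatives. This is only an illustration of the general idea under the assumption of spatially aligned views; the function name, tensor shapes, and masking scheme are assumptions, not the exact DenseCL or DenseCL++ objective.

```python
# Sketch of a dense InfoNCE-style loss with dense negatives from other images.
# Hypothetical illustration only; not the exact DenseCL / DenseCL++ formulation.
import torch
import torch.nn.functional as F


def dense_info_nce(feats_q, feats_k, temperature=0.2):
    """feats_q, feats_k: (B, C, H, W) dense projections of two augmented views,
    assumed to be spatially aligned. Each location in feats_q uses the same
    location in feats_k as its positive; locations from *other* images in the
    batch act as dense negatives."""
    B, C, H, W = feats_q.shape
    q = F.normalize(feats_q.flatten(2).permute(0, 2, 1), dim=-1)  # (B, HW, C)
    k = F.normalize(feats_k.flatten(2).permute(0, 2, 1), dim=-1)  # (B, HW, C)

    # Positive logits: same image, same spatial location across the two views.
    pos = (q * k).sum(-1, keepdim=True)                           # (B, HW, 1)

    # Candidate negative logits: every location of every image in the batch.
    all_k = k.reshape(B * H * W, C)                               # (BHW, C)
    logits_all = q.reshape(B * H * W, C) @ all_k.t()              # (BHW, BHW)

    # Mask same-image pairs so negatives only come from different images.
    img_id = torch.arange(B, device=feats_q.device).repeat_interleave(H * W)
    same_img = img_id[:, None] == img_id[None, :]
    logits_all = logits_all.masked_fill(same_img, float("-inf"))

    # Index 0 of each row is the positive; cross-entropy pulls it up,
    # pushes the dense negatives down.
    logits = torch.cat([pos.reshape(-1, 1), logits_all], dim=1) / temperature
    labels = torch.zeros(logits.shape[0], dtype=torch.long, device=logits.device)
    return F.cross_entropy(logits, labels)
```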

Related research

12/01/2022: Research on the application of contrastive learning in multi-label text classification
  The effective application of contrastive learning technology in natural ...

09/03/2022: Label Structure Preserving Contrastive Embedding for Multi-Label Learning with Missing Labels
  Contrastive learning (CL) has shown impressive advances in image represe...

03/20/2023: MXM-CLR: A Unified Framework for Contrastive Learning of Multifold Cross-Modal Representations
  Multifold observations are common for different data modalities, e.g., a...

08/07/2023: Multi-Label Self-Supervised Learning with Scene Images
  Self-supervised learning (SSL) methods targeting scene images have seen ...

06/23/2023: Patch-Level Contrasting without Patch Correspondence for Accurate and Dense Contrastive Representation Learning
  We propose ADCLR: Accurate and Dense Contrastive Representation Learni...

11/26/2020: Beyond Single Instance Multi-view Unsupervised Representation Learning
  Recent unsupervised contrastive representation learning follows a Single...

05/31/2022: Contrasting quadratic assignments for set-based representation learning
  The standard approach to contrastive learning is to maximize the agreeme...
