Label Structure Preserving Contrastive Embedding for Multi-Label Learning with Missing Labels

09/03/2022
by   Zhongchen Ma, et al.

Contrastive learning (CL) has shown impressive advances in image representation learning, in both supervised multi-class classification and unsupervised settings. However, these CL methods cannot be directly adapted to multi-label image classification, because it is difficult to define the positive and negative instances to contrast against a given anchor image in the multi-label scenario, let alone in the missing-label one: borrowing the definitions commonly used in contrastive multi-class learning would incur many false negative instances that are unfavorable for learning. In this paper, by introducing a label correction mechanism to identify missing labels, we first generate positives and negatives for the individual semantic labels of an anchor image, and then define a new contrastive loss for multi-label image classification with missing labels (CLML). This loss accurately pulls images close to their true positive images and false negative images, and pushes them away from their true negative images. Unlike existing multi-label CL losses, CLML also preserves low-rank global and local label dependencies in the latent representation space, where such dependencies have been shown to help in dealing with missing labels. To the best of our knowledge, this is the first general multi-label CL loss for the missing-label scenario, and it can therefore be seamlessly paired with the losses of any existing multi-label learning method via a single hyperparameter. The proposed strategy has been shown to improve the classification performance of the ResNet101 model by margins of at least 1.2% on three standard datasets, MSCOCO, VOC, and NUS-WIDE. Code is available at https://github.com/chuangua/ContrastiveLossMLML.
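To give a concrete feel for this kind of loss, below is a minimal PyTorch sketch of a supervised multi-label contrastive loss in the same spirit. It is an illustration only, not the authors' implementation (see the repository above for that): the function name clml_style_loss, the share-at-least-one-label rule for positives, and the temperature value are our assumptions, and the sketch omits the paper's label correction mechanism and its low-rank label-dependency term.

```python
import torch
import torch.nn.functional as F

def clml_style_loss(embeddings, labels, temperature=0.1):
    """Illustrative multi-label supervised contrastive loss (sketch).

    embeddings: (B, D) image embeddings from a backbone such as ResNet101.
    labels:     (B, C) binary label matrix; missing labels are assumed to
                have already been corrected (set to 1) where recoverable,
                so that false negatives are contrasted as positives.
    Positive pairs: images sharing at least one semantic label.
    """
    z = F.normalize(embeddings, dim=1)
    sim = (z @ z.t()) / temperature                 # (B, B) scaled cosine similarities

    eye = torch.eye(z.size(0), dtype=torch.bool, device=z.device)
    pos_mask = (labels.float() @ labels.float().t() > 0) & ~eye

    # InfoNCE-style: log-softmax over all non-self pairs, averaged over positives.
    logits = sim.masked_fill(eye, float('-inf'))
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)
    pos_log_prob = log_prob.masked_fill(~pos_mask, 0.0)
    per_anchor = -pos_log_prob.sum(dim=1) / pos_mask.sum(dim=1).clamp(min=1)

    has_pos = pos_mask.any(dim=1)                   # skip anchors with no positives
    return per_anchor[has_pos].mean()
```

Pairing it with an existing multi-label loss via a single hyperparameter, as the abstract describes, would then reduce to one extra term, e.g. total_loss = bce_loss + lam * clml_style_loss(embeddings, corrected_labels), where lam is that hyperparameter.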

Related research

06/17/2021 - Multi-Label Learning from Single Positive Labels
Predicting all applicable labels for a given image is known as multi-lab...

07/08/2023 - End-to-End Supervised Multilabel Contrastive Learning
Multilabel representation learning is recognized as a challenging proble...

04/27/2022 - Use All The Labels: A Hierarchical Multi-Label Contrastive Learning Framework
Current contrastive learning frameworks focus on leveraging a single sup...

10/11/2022 - Improving Dense Contrastive Learning with Dense Negative Pairs
Many contrastive representation learning methods learn a single global r...

12/13/2021 - Simple and Robust Loss Design for Multi-Label Learning with Missing Labels
Multi-label learning in the presence of missing labels (MLML) is a chall...

07/20/2020 - Multi-label Contrastive Predictive Coding
Variational mutual information (MI) estimators are widely used in unsupe...

09/13/2022 - Class-Level Logit Perturbation
Features, logits, and labels are the three primary data when a sample pa...
