End-to-End Supervised Multilabel Contrastive Learning

07/08/2023
by Ahmad Sajedi, et al.

Multilabel representation learning is a challenging problem affected by label dependencies between object categories and by data-related issues such as the inherent imbalance of positive and negative samples. Recent advances address these challenges from model- and data-centric viewpoints. In model-centric approaches, label correlation is captured by an external model design (e.g., a graph CNN) that incorporates an inductive bias into training. However, these approaches do not admit end-to-end training and incur high computational complexity. Data-centric approaches, in contrast, exploit the realistic nature of the dataset to improve classification but ignore label dependencies. In this paper, we propose a new end-to-end training framework, dubbed KMCL (Kernel-based Multilabel Contrastive Learning), to address the shortcomings of both model- and data-centric designs. KMCL first transforms the embedded features into a mixture of exponential kernels in a Gaussian RKHS. It then optimizes an objective loss comprising (a) a reconstruction loss that rebuilds the kernel representation, (b) an asymmetric classification loss that addresses the inherent imbalance problem, and (c) a contrastive loss that captures label correlation. KMCL models the uncertainty of the feature encoder while maintaining a low computational footprint. Extensive experiments on image classification tasks showcase consistent improvements of KMCL over SOTA methods. A PyTorch implementation is provided at <https://github.com/mahdihosseini/KMCL>.
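The abstract names three loss components. Below is a minimal PyTorch sketch of how such a composite objective might be wired together; the class name KMCLLossSketch, the term weights, the MSE surrogate for the reconstruction term, and the label-co-occurrence contrastive mask are illustrative assumptions, not the authors' implementation (which is available at the linked repository).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class KMCLLossSketch(nn.Module):
    """Illustrative composite loss in the spirit of KMCL (not the paper's code).

    Combines three terms:
      (a) a reconstruction loss on a kernel representation (MSE surrogate here),
      (b) an asymmetric classification loss for positive/negative imbalance,
      (c) a supervised contrastive loss whose positives share at least one label.
    """
    def __init__(self, gamma_neg=4.0, gamma_pos=0.0, temperature=0.1,
                 w_rec=1.0, w_asl=1.0, w_con=1.0):
        super().__init__()
        self.gamma_neg, self.gamma_pos = gamma_neg, gamma_pos
        self.temperature = temperature
        self.w_rec, self.w_asl, self.w_con = w_rec, w_asl, w_con

    def asymmetric_loss(self, logits, targets):
        # Asymmetric focal-style loss: easy negatives are down-weighted
        # more aggressively (gamma_neg) than positives (gamma_pos).
        p = torch.sigmoid(logits)
        pos = targets * torch.log(p.clamp(min=1e-8)) * (1 - p) ** self.gamma_pos
        neg = (1 - targets) * torch.log((1 - p).clamp(min=1e-8)) * p ** self.gamma_neg
        return -(pos + neg).mean()

    def contrastive_loss(self, z, targets):
        # Supervised contrastive term: samples sharing >= 1 label attract.
        z = F.normalize(z, dim=1)
        sim = z @ z.t() / self.temperature
        eye = torch.eye(sim.size(0), dtype=torch.bool, device=sim.device)
        sim = sim.masked_fill(eye, -1e9)  # exclude self-similarity
        pos_mask = (targets @ targets.t() > 0).float().masked_fill(eye, 0.0)
        log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
        denom = pos_mask.sum(1).clamp(min=1)
        return -((pos_mask * log_prob).sum(1) / denom).mean()

    def forward(self, logits, z, kernel_recon, kernel_target, targets):
        # targets: multi-hot float tensor of shape (batch, num_classes).
        l_rec = F.mse_loss(kernel_recon, kernel_target)   # (a) reconstruction
        l_asl = self.asymmetric_loss(logits, targets)     # (b) imbalance-aware
        l_con = self.contrastive_loss(z, targets)         # (c) label correlation
        return self.w_rec * l_rec + self.w_asl * l_asl + self.w_con * l_con
```

In a real pipeline, kernel_recon and kernel_target would come from the encoder's kernel-mixture head over the Gaussian RKHS; here they are treated as opaque tensors to keep the sketch self-contained.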
