Research on the application of contrastive learning in multi-label text classification

12/01/2022
by Nankai Lin, et al.

The effective application of contrastive learning in natural language processing demonstrates its strength for text analysis tasks. The core challenge of contrastive learning is how to construct positive and negative samples correctly and reasonably. Because it is difficult to construct contrastive objects in multi-label text classification, few contrastive losses have been proposed for this setting. In this paper, we propose five contrastive losses for multi-label text classification: Strict Contrastive Loss (SCL), Intra-label Contrastive Loss (ICL), Jaccard Similarity Contrastive Loss (JSCL), Jaccard Similarity Probability Contrastive Loss (JSPCL), and Stepwise Label Contrastive Loss (SLCL). We explore the effectiveness of contrastive learning for multi-label classification under these different strategies and provide a set of baseline methods for applying contrastive learning techniques to multi-label classification tasks. We also perform an interpretability analysis of our approach to show how the different contrastive learning methods play their roles. The experimental results demonstrate that our proposed contrastive losses bring consistent improvements on multi-label classification tasks. Our work reveals that "appropriately" changing the contrastive objects is the key to improving the adaptability of contrastive learning to multi-label classification.
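The exact loss formulations are given in the full paper. As an illustration of the general idea behind a Jaccard-similarity-based loss (in the spirit of JSCL), one could weight each pair of samples by the Jaccard similarity of their label sets, so that samples sharing more labels are pulled closer together. The function name, weighting scheme, and temperature below are assumptions for illustration, not the paper's definition:

```python
import numpy as np

def jaccard_weighted_contrastive_loss(embeddings, labels, temperature=0.1):
    """Illustrative sketch: a Jaccard-similarity-weighted contrastive loss.

    embeddings: (N, D) L2-normalized text representations
    labels:     (N, C) multi-hot label matrix
    Pairs whose label sets overlap more (higher Jaccard similarity)
    contribute more strongly as positives.
    """
    n = embeddings.shape[0]
    # Cosine similarities scaled by temperature.
    sim = embeddings @ embeddings.T / temperature
    # Jaccard similarity between the label sets of every pair.
    inter = labels @ labels.T                                      # |A ∩ B|
    union = labels.sum(1, keepdims=True) + labels.sum(1) - inter   # |A ∪ B|
    jac = np.where(union > 0, inter / np.maximum(union, 1), 0.0)
    np.fill_diagonal(jac, 0.0)  # exclude self-pairs

    # Softmax cross-entropy over each row, weighted by Jaccard similarity.
    mask = ~np.eye(n, dtype=bool)
    loss = 0.0
    for i in range(n):
        logits = sim[i][mask[i]]
        log_prob = logits - np.log(np.exp(logits).sum())
        weights = jac[i][mask[i]]
        if weights.sum() > 0:
            loss += -(weights * log_prob).sum() / weights.sum()
    return loss / n
```

With this weighting, samples whose label sets are identical (Jaccard similarity 1) act as full positives, disjoint label sets contribute nothing, and partial overlap falls in between, which is one natural way to soften the strict positive/negative split of standard contrastive learning for the multi-label case.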

Related research:

- Multi-Label Image Classification with Contrastive Learning (07/24/2021)
- Improving Dense Contrastive Learning with Dense Negative Pairs (10/11/2022)
- Interlock-Free Multi-Aspect Rationalization for Text Classification (05/13/2022)
- Gaussian Mixture Variational Autoencoder with Contrastive Learning for Multi-Label Classification (12/02/2021)
- F-PABEE: Flexible-patience-based Early Exiting for Single-label and Multi-label text Classification Tasks (05/21/2023)
- ConsPrompt: Easily Exploiting Contrastive Samples for Few-shot Prompt Learning (11/08/2022)
- End-to-End Supervised Multilabel Contrastive Learning (07/08/2023)
