Pair-Level Supervised Contrastive Learning for Natural Language Inference

01/26/2022
by Shu'ang Li, et al.

Natural language inference (NLI) is an increasingly important task for natural language understanding, which requires inferring the relationship between a sentence pair (premise and hypothesis). Many recent works have applied contrastive learning, incorporating the relationship of the sentence pair from NLI datasets to learn sentence representations. However, these methods focus only on comparisons between sentence-level representations. In this paper, we propose a Pair-level Supervised Contrastive Learning approach (PairSCL). We adopt a cross-attention module to learn joint representations of the sentence pairs. A contrastive learning objective is designed to distinguish the classes of sentence pairs by pulling pairs of the same class together and pushing apart pairs of different classes. We evaluate PairSCL on two public NLI datasets, where its accuracy outperforms other methods by 2.1 points on average. Furthermore, our method outperforms the previous state-of-the-art method on seven transfer tasks of text classification.
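To make the pair-level objective concrete, here is a minimal NumPy sketch of a supervised contrastive loss computed over joint pair representations: for each anchor pair, representations of pairs with the same NLI label are treated as positives and all others as negatives. This is a generic supervised contrastive formulation, not the paper's exact implementation; the function name, temperature value, and toy inputs are illustrative assumptions.

```python
import numpy as np

def pair_supcon_loss(pair_reprs, labels, temperature=0.1):
    """Supervised contrastive loss over joint pair representations (sketch).

    pair_reprs: (N, d) array, one joint representation per (premise, hypothesis) pair.
    labels:     (N,) array of NLI class ids (e.g. 0=entailment, 1=neutral, 2=contradiction).
    """
    # L2-normalize so the dot product is cosine similarity
    z = pair_reprs / np.linalg.norm(pair_reprs, axis=1, keepdims=True)
    sim = z @ z.T / temperature
    n = len(labels)
    self_mask = np.eye(n, dtype=bool)
    sim = np.where(self_mask, -np.inf, sim)  # exclude self-comparisons
    # row-wise log-softmax over all other pairs in the batch
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    # positives: same label, different pair
    pos_mask = (labels[:, None] == labels[None, :]) & ~self_mask
    n_pos = pos_mask.sum(axis=1)
    has_pos = n_pos > 0  # skip anchors with no positive in the batch
    anchor_loss = -(np.where(pos_mask, log_prob, 0.0).sum(axis=1)[has_pos]
                    / n_pos[has_pos])
    return anchor_loss.mean()
```

Minimizing this loss pulls same-class pair representations together and pushes different-class pairs apart: a batch whose same-label representations are already close yields a lower loss than one where they are scattered.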
