A Multi-level Supervised Contrastive Learning Framework for Low-Resource Natural Language Inference

05/31/2022
by   Shu'ang Li, et al.

Natural Language Inference (NLI) is an increasingly essential task in natural language understanding, which requires inferring the relationship between a pair of sentences (premise and hypothesis). Recently, low-resource natural language inference has gained increasing attention, due to significant savings in manual annotation costs and a better fit with real-world scenarios. With limited training data, existing works fail to learn discriminative representations between different classes, which may cause faults in label prediction. Here we propose a multi-level supervised contrastive learning framework named MultiSCL for low-resource natural language inference. MultiSCL leverages sentence-level and pair-level contrastive learning objectives to discriminate between different classes of sentence pairs by bringing those in one class together and pushing away those in different classes. MultiSCL adopts a data augmentation module that generates different views of input samples to better learn the latent representation. The pair-level representation is obtained from a cross attention module. We conduct extensive experiments on two public NLI datasets in low-resource settings, where the accuracy of MultiSCL exceeds other models by 3.1% on average; MultiSCL also outperforms the previous state-of-the-art method on cross-domain tasks of text classification.
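The core idea described above, pulling same-class representations together while pushing different classes apart, follows the general supervised contrastive objective. Below is a minimal NumPy sketch of such a loss over a batch of (already encoded) sentence-pair representations. This is an illustration of the general technique, not MultiSCL's exact multi-level loss; the function name, the temperature value, and the plain-NumPy setting are all assumptions for the example.

```python
import numpy as np

def supervised_contrastive_loss(embeddings, labels, temperature=0.1):
    """Supervised contrastive loss: for each anchor, maximize the
    log-probability of same-label samples among all other samples."""
    # L2-normalize so the dot product is cosine similarity
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature

    n = len(labels)
    self_mask = np.eye(n, dtype=bool)
    # positives: same label as the anchor, excluding the anchor itself
    pos = (labels[:, None] == labels[None, :]) & ~self_mask

    # exclude self-similarity, then take a row-wise log-softmax
    sim = np.where(self_mask, -np.inf, sim)
    row_max = sim.max(axis=1, keepdims=True)
    lse = row_max + np.log(np.exp(sim - row_max).sum(axis=1, keepdims=True))
    log_prob = sim - lse

    # mean log-probability of positives per anchor (skip anchors with none)
    pos_counts = pos.sum(axis=1)
    valid = pos_counts > 0
    pos_log_prob = np.where(pos, log_prob, 0.0).sum(axis=1)
    loss = -pos_log_prob[valid] / pos_counts[valid]
    return loss.mean()
```

With perfectly separated classes (e.g. one class at `[1, 0]` and the other at `[-1, 0]`), the loss approaches `log(k - 1)` for k samples per class, since each anchor's probability mass splits evenly over its positives; random embeddings score noticeably worse.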


Related research

01/26/2022 · Pair-Level Supervised Contrastive Learning for Natural Language Inference
Natural language inference (NLI) is an increasingly important task for n...

10/16/2020 · CoDA: Contrast-enhanced and Diversity-promoting Data Augmentation for Natural Language Understanding
Data augmentation has been demonstrated as an effective strategy for imp...

04/18/2022 · Detect Rumors in Microblog Posts for Low-Resource Domains via Adversarial Contrastive Learning
Massive false rumors emerging along with breaking news or trending topic...

12/20/2022 · CoCo: Coherence-Enhanced Machine-Generated Text Detection Under Data Limitation With Contrastive Learning
Machine-Generated Text (MGT) detection, a task that discriminates MGT fr...

01/21/2022 · Dual Contrastive Learning: Text Classification via Label-Aware Data Augmentation
Contrastive learning has achieved remarkable success in representation l...

04/04/2023 · A Unified Contrastive Transfer Framework with Propagation Structure for Boosting Low-Resource Rumor Detection
The truth is significantly hampered by massive rumors that spread along ...

11/12/2021 · Exploiting all samples in low-resource sentence classification: early stopping and initialization parameters
In low resource settings, deep neural models have often shown lower perf...
