Symmetric Regularization based BERT for Pair-wise Semantic Reasoning

by Xingyi Cheng et al.

The ability to perform semantic reasoning over sentence pairs is essential for many natural language understanding tasks, e.g., natural language inference and machine reading comprehension. A recent significant improvement on these tasks comes from BERT. As reported, the next sentence prediction (NSP) objective in BERT, which learns the contextual relationship between two sentences, is of great significance for downstream problems with sentence-pair input. Despite the effectiveness of NSP, we argue that it still lacks the essential signal needed to distinguish entailment from shallow correlation. To remedy this, we propose to augment NSP into a 3-class classification task by adding a category for previous sentence prediction (PSP). The inclusion of PSP encourages the model to focus on informative semantics in order to determine the sentence order, thereby improving its capacity for semantic understanding. This simple modification yields a remarkable improvement over vanilla BERT. To further incorporate document-level information, the scope of NSP and PSP is expanded to a broader range: both tasks also include close but non-successive sentences, with the resulting label noise mitigated by label smoothing. Both qualitative and quantitative experimental results demonstrate the effectiveness of the proposed method. Our method consistently improves performance on NLI and MRC benchmarks, including the challenging HANS dataset, suggesting that document-level tasks remain promising for pre-training.
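As a rough illustration of the objective described above, the following sketch builds one sentence-pair training example for the 3-class task (next, previous, or random sentence) with label smoothing applied to the target distribution. The class ids, the 0.1 smoothing factor, and the uniform sampling of the three cases are assumptions for illustration, not the paper's exact configuration.

```python
import random

NUM_CLASSES = 3  # assumed ids: 0 = next sentence, 1 = previous, 2 = random


def smooth_label(class_id, epsilon=0.1, num_classes=NUM_CLASSES):
    """One-hot target softened by label smoothing: the true class keeps
    1 - epsilon of the mass, the rest is spread over the other classes."""
    dist = [epsilon / (num_classes - 1)] * num_classes
    dist[class_id] = 1.0 - epsilon
    return dist


def make_pair(doc, corpus_sentences, i, rng):
    """Build one (sentence_a, sentence_b, label) example from sentence i
    of `doc`; `corpus_sentences` is the pool for random negatives."""
    choice = rng.random()
    if choice < 1 / 3 and i + 1 < len(doc):
        # next sentence prediction (NSP)
        return doc[i], doc[i + 1], smooth_label(0)
    if choice < 2 / 3 and i > 0:
        # previous sentence prediction (PSP)
        return doc[i], doc[i - 1], smooth_label(1)
    # random sentence from the corpus
    return doc[i], rng.choice(corpus_sentences), smooth_label(2)
```

Under this sketch, the smoothed targets (rather than hard one-hot labels) are what absorb the noise when the "next"/"previous" scope is widened to close but non-successive sentences.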

