MERIt: Meta-Path Guided Contrastive Learning for Logical Reasoning

03/01/2022
by Fangkai Jiao, et al.

Logical reasoning is of vital importance to natural language understanding. Previous studies either employ graph-based models to incorporate prior knowledge about logical relations, or introduce symbolic logic into neural models through data augmentation. These methods, however, heavily depend on annotated training data, and thus suffer from over-fitting and poor generalization problems due to the dataset sparsity. To address these two problems, in this paper, we propose MERIt, a MEta-path guided contrastive learning method for logical ReasonIng of text, to perform self-supervised pre-training on abundant unlabeled text data. Two novel strategies serve as indispensable components of our method. In particular, a strategy based on meta-path is devised to discover the logical structure in natural texts, followed by a counterfactual data augmentation strategy to eliminate the information shortcut induced by pre-training. The experimental results on two challenging logical reasoning benchmarks, i.e., ReClor and LogiQA, demonstrate that our method outperforms the SOTA baselines with significant improvements.
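
To make the pre-training objective concrete, below is a minimal sketch of a generic InfoNCE-style contrastive loss of the kind MERIt builds on: an anchor context is pulled toward a meta-path-consistent positive and pushed away from counterfactually edited negatives. The function name, tensor shapes, and temperature value are illustrative assumptions, not the paper's actual implementation; the meta-path construction and counterfactual editing steps are not reproduced here.

```python
# Illustrative sketch only: a generic InfoNCE-style contrastive objective.
# The pairing of anchors with meta-path-consistent positives and
# counterfactual negatives is assumed, not taken from the paper's code.
import torch
import torch.nn.functional as F


def info_nce_loss(anchor: torch.Tensor,
                  positive: torch.Tensor,
                  negatives: torch.Tensor,
                  temperature: float = 0.07) -> torch.Tensor:
    """anchor, positive: (batch, dim); negatives: (batch, n_neg, dim)."""
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    negatives = F.normalize(negatives, dim=-1)

    # Similarity of each anchor to its positive (logically consistent) context.
    pos_sim = (anchor * positive).sum(dim=-1, keepdim=True)      # (batch, 1)
    # Similarity to counterfactually perturbed (negative) contexts.
    neg_sim = torch.einsum('bd,bnd->bn', anchor, negatives)      # (batch, n_neg)

    # Cross-entropy with the positive placed at index 0.
    logits = torch.cat([pos_sim, neg_sim], dim=1) / temperature
    labels = torch.zeros(anchor.size(0), dtype=torch.long, device=logits.device)
    return F.cross_entropy(logits, labels)
```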
