Incorporating Connections Beyond Knowledge Embeddings: A Plug-and-Play Module to Enhance Commonsense Reasoning in Machine Reading Comprehension

03/26/2021
by   Damai Dai, et al.

Conventional Machine Reading Comprehension (MRC) has been well addressed by pattern matching, but commonsense reasoning remains a gap between humans and machines. Previous methods tackle this problem by enriching word representations via pre-trained Knowledge Graph Embeddings (KGE). However, they make limited use of the large number of connections between nodes in Knowledge Graphs (KG), which could serve as pivotal cues for building commonsense reasoning chains. In this paper, we propose a Plug-and-play module to IncorporatE Connection information for commonsEnse Reasoning (PIECER). Beyond enriching word representations with knowledge embeddings, PIECER constructs a joint query-passage graph to explicitly guide commonsense reasoning via the knowledge-oriented connections between words. Further, PIECER is highly generalizable, since it can be plugged into suitable positions in any MRC model. Experimental results on ReCoRD, a large-scale public MRC dataset requiring commonsense reasoning, show that PIECER yields stable performance improvements for four representative base MRC models, especially in low-resource settings.


