LUKE-Graph: A Transformer-based Approach with Gated Relational Graph Attention for Cloze-style Reading Comprehension

03/12/2023
by Shima Foolad, et al.

Incorporating prior knowledge can improve existing pre-trained models for cloze-style machine reading comprehension and has become a new trend in recent studies. Notably, most existing models integrate external knowledge graphs (KGs) and transformer-based models such as BERT into a unified data structure. However, selecting the most relevant entities among ambiguous KG candidates and extracting the best subgraph remain challenging. In this paper, we propose LUKE-Graph, a model that builds a heterogeneous graph from the intuitive relationships between entities in a document, without using any external KG. We then use a Relational Graph Attention (RGAT) network to fuse the graph's reasoning information with the contextual representation encoded by the pre-trained LUKE model. In this way, we take advantage of LUKE to derive entity-aware representations, and of the graph model to exploit relation-aware representations. Moreover, we propose Gated-RGAT, which augments RGAT with a gating mechanism that regulates how question information enters the graph convolution operation. This closely mirrors human reasoning, since people choose the best candidate entity in light of the question. Experimental results demonstrate that LUKE-Graph achieves state-of-the-art performance on the ReCoRD dataset, which requires commonsense reasoning.
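The abstract only names the components; the exact equations are in the paper. As a rough illustration of the gating idea, the PyTorch sketch below shows one possible single Gated-RGAT layer: relation-specific projections, attention over each node's incoming edges, and a sigmoid gate conditioned on the question vector that controls how much of the aggregated graph message updates each node. The class name, the (src, dst, rel) edge format, and all parameter names here are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedRGATLayer(nn.Module):
    """Hypothetical sketch of one Gated-RGAT layer.

    For each edge (src -> dst) of relation rel, the source state is projected
    with a relation-specific weight, attention is computed over dst's incoming
    edges, and the aggregated message is blended into dst's state through a
    gate conditioned on the question representation q.
    """

    def __init__(self, dim: int, num_relations: int):
        super().__init__()
        # One projection per edge type in the heterogeneous graph.
        self.rel_proj = nn.ModuleList(
            [nn.Linear(dim, dim, bias=False) for _ in range(num_relations)])
        self.attn = nn.Linear(2 * dim, 1)   # scores (dst state, message) pairs
        self.gate = nn.Linear(2 * dim, dim) # mixes question info into updates

    def forward(self, h: torch.Tensor, edges, q: torch.Tensor) -> torch.Tensor:
        # h: (num_nodes, dim) node states; q: (dim,) question vector
        # edges: iterable of (src, dst, rel) index triples
        num_nodes, dim = h.shape
        new_h = h.clone()  # nodes with no incoming edges keep their state
        incoming = {}
        for src, dst, rel in edges:
            incoming.setdefault(dst, []).append((src, rel))
        for dst, nbrs in incoming.items():
            # Relation-specific projections of the neighbour states.
            msgs = torch.stack([self.rel_proj[rel](h[src]) for src, rel in nbrs])
            # Attention over dst's incoming messages.
            scores = self.attn(torch.cat(
                [h[dst].expand(len(nbrs), dim), msgs], dim=-1)).squeeze(-1)
            alpha = F.softmax(scores, dim=0)
            agg = (alpha.unsqueeze(-1) * msgs).sum(dim=0)
            # Question-conditioned gate: how much graph evidence flows in.
            g = torch.sigmoid(self.gate(torch.cat([q, agg], dim=-1)))
            new_h[dst] = g * agg + (1.0 - g) * h[dst]
        return new_h
```

In a full model one would presumably stack a few such layers over the entity graph built from the document, initializing h from LUKE's entity-aware token representations and q from its encoding of the question span.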
