SG-Net: Syntax-Guided Machine Reading Comprehension

08/14/2019
by   Zhuosheng Zhang, et al.

For machine reading comprehension, effectively modeling the linguistic knowledge in detail-riddled, lengthy passages while filtering out noise is essential to good performance. In this work, we propose using syntax to guide the text modeling of both passages and questions by incorporating syntactic clues into the multi-head attention mechanism, fully fusing information from both global and attended representations. Accordingly, we present a novel syntax-guided network (SG-Net) for challenging reading comprehension tasks. Extensive experiments on popular benchmarks, including SQuAD 2.0 and RACE, validate the effectiveness of the proposed method, which yields substantial improvements over fine-tuned BERT. This work empirically demonstrates the value of syntactic structural information for text modeling. The proposed attention mechanism also verifies the practicability of using linguistic information to guide attention learning, and it can easily be adapted to other tree-structured annotations.
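To make the abstract's mechanism concrete, here is a minimal sketch of syntax-guided attention: a binary mask derived from a dependency parse restricts which positions each token may attend to, and that mask is applied inside standard scaled dot-product attention. The "token attends to itself and its ancestors" rule and all function names are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def sdoi_mask(heads):
    """Build a syntax mask from dependency heads (heads[i] = parent of token i, -1 for root).

    Assumed rule for illustration: each token may attend to itself and its
    ancestors in the dependency tree; the paper's exact mask may differ.
    """
    n = len(heads)
    mask = np.zeros((n, n), dtype=bool)
    for i in range(n):
        mask[i, i] = True
        j = heads[i]
        while j != -1:  # walk up to the root, unmasking each ancestor
            mask[i, j] = True
            j = heads[j]
    return mask

def syntax_guided_attention(Q, K, V, mask):
    """Scaled dot-product attention restricted to syntax-masked positions."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    scores = np.where(mask, scores, -1e9)  # block positions outside the mask
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V
```

In the full model, this masked head runs alongside the ordinary (global) attention heads, and the two representations are fused; the sketch shows only the syntax-constrained branch.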


