Multi-granularity hierarchical attention fusion networks for reading comprehension and question answering

11/29/2018
by Wei Wang, et al.

This paper describes a novel hierarchical attention network for reading comprehension style question answering, which aims to answer questions about a given narrative paragraph. In the proposed method, attention and fusion are conducted both horizontally and vertically across layers, at different levels of granularity, between the question and the paragraph. Specifically, the model first encodes the question and paragraph with fine-grained language embeddings to better capture their representations at the semantic level. It then applies a multi-granularity fusion approach to fully fuse information from both the global and the attended representations. Finally, it introduces a hierarchical attention network that focuses on the answer span progressively through multi-level soft alignment. Extensive experiments on the large-scale SQuAD and TriviaQA datasets validate the effectiveness of the proposed method. At the time of writing the paper (Jan. 12th, 2018), our model achieves the first position on the SQuAD leaderboard for both single and ensemble models. We also achieve state-of-the-art results on the TriviaQA, AddSent, and AddOneSent datasets.
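Since the abstract centers on soft-aligning each paragraph token against the question and then fusing the global and attended representations, the sketch below illustrates one such attention-then-fuse step in PyTorch. The layer names, the difference/product features, and the gated combination are illustrative assumptions for exposition, not the authors' exact architecture.

```python
# Minimal PyTorch sketch of question-paragraph attention followed by a fusion
# step, in the spirit of the multi-granularity fusion described above.
# All names, dimensions, and the gated fusion form are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionFusion(nn.Module):
    def __init__(self, hidden_size: int):
        super().__init__()
        # Projection and gate used to fuse the original ("global") and
        # attended representations of each paragraph token.
        self.fuse = nn.Linear(4 * hidden_size, hidden_size)
        self.gate = nn.Linear(4 * hidden_size, hidden_size)

    def forward(self, paragraph: torch.Tensor, question: torch.Tensor) -> torch.Tensor:
        # paragraph: (batch, p_len, hidden); question: (batch, q_len, hidden)
        # Soft-align each paragraph token against all question tokens.
        scores = torch.bmm(paragraph, question.transpose(1, 2))  # (batch, p_len, q_len)
        alpha = F.softmax(scores, dim=-1)
        attended = torch.bmm(alpha, question)                    # (batch, p_len, hidden)

        # Fuse the global and attended views: concatenation, difference,
        # and element-wise product, combined through a learned gate.
        combined = torch.cat(
            [paragraph, attended, paragraph - attended, paragraph * attended], dim=-1)
        fused = torch.tanh(self.fuse(combined))
        g = torch.sigmoid(self.gate(combined))
        return g * fused + (1 - g) * paragraph

# Usage: stacking this block at several granularities (e.g. word, phrase,
# sentence level) would give one hierarchical variant of the idea.
batch, p_len, q_len, hidden = 2, 50, 12, 128
layer = AttentionFusion(hidden)
out = layer(torch.randn(batch, p_len, hidden), torch.randn(batch, q_len, hidden))
print(out.shape)  # torch.Size([2, 50, 128])
```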

Related research

QAInfomax: Learning Robust Question Answering System by Mutual Information Maximization (08/31/2019)
Standard accuracy metrics indicate that modern reading comprehension sys...

FusionNet: Fusing via Fully-Aware Attention with Application to Machine Comprehension (11/16/2017)
This paper introduces a new neural structure called FusionNet, which ext...

Continuous Decomposition of Granularity for Neural Paraphrase Generation (09/05/2022)
While Transformers have had significant success in paragraph generation,...

Adaptive Bi-directional Attention: Exploring Multi-Granularity Representations for Machine Reading Comprehension (12/20/2020)
Recently, the attention-enhanced multi-layer encoder, such as Transforme...

No Answer is Better Than Wrong Answer: A Reflection Model for Document Level Machine Reading Comprehension (09/25/2020)
The Natural Questions (NQ) benchmark set brings new challenges to Machin...

Multi-Perspective Fusion Network for Commonsense Reading Comprehension (01/08/2019)
Commonsense Reading Comprehension (CRC) is a significantly challenging t...

MCR-Net: A Multi-Step Co-Interactive Relation Network for Unanswerable Questions on Machine Reading Comprehension (03/08/2021)
Question answering systems usually use keyword searches to retrieve pote...
