Bidirectional Attention Flow for Machine Comprehension

11/05/2016
by Minjoon Seo et al.

Machine comprehension (MC), answering a query about a given context paragraph, requires modeling complex interactions between the context and the query. Recently, attention mechanisms have been successfully extended to MC. Typically, these methods use attention to focus on a small portion of the context and summarize it with a fixed-size vector, couple attentions temporally, and/or form a uni-directional attention. In this paper we introduce the Bi-Directional Attention Flow (BIDAF) network, a multi-stage hierarchical process that represents the context at different levels of granularity and uses a bi-directional attention flow mechanism to obtain a query-aware context representation without early summarization. Our experimental evaluations show that our model achieves state-of-the-art results on the Stanford Question Answering Dataset (SQuAD) and the CNN/DailyMail cloze test.
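The core of the bi-directional attention layer can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: it uses a plain dot product as the similarity function (the paper learns a trainable similarity alpha(h, u)), and all array names and shapes here are illustrative assumptions. It shows the two attention directions, context-to-query and query-to-context, and how the query-aware representation keeps one vector per context position rather than summarizing early.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def bidaf_attention(H, U):
    """Bi-directional attention between a context H of shape (T, d)
    and a query U of shape (J, d).

    Note: similarity here is a simple dot product for illustration;
    the paper uses a trainable function of [h; u; h * u].
    """
    S = H @ U.T                        # (T, J) similarity matrix
    # Context-to-query: each context word attends over all query words.
    a = softmax(S, axis=1)             # (T, J) attention weights
    U_tilde = a @ U                    # (T, d) attended query vectors
    # Query-to-context: attend over context words via the max similarity
    # each context word has with any query word.
    b = softmax(S.max(axis=1))         # (T,) attention weights
    h_tilde = b @ H                    # (d,) attended context vector
    H_tilde = np.tile(h_tilde, (H.shape[0], 1))   # (T, d), broadcast to all positions
    # Query-aware context representation: one vector per context position
    # (no early summarization into a single fixed-size vector).
    G = np.concatenate([H, U_tilde, H * U_tilde, H * H_tilde], axis=1)  # (T, 4d)
    return G
```

For a context of T = 5 words and a query of J = 3 words with d = 4 dimensional encodings, the output G has shape (5, 16): every context position retains its own query-aware vector, which the downstream modeling layer can then process.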


Related research

Reinforced Mnemonic Reader for Machine Comprehension (05/08/2017)
In this paper, we introduce the Reinforced Mnemonic Reader for machine c...

Pay More Attention - Neural Architectures for Question-Answering (03/25/2018)
Machine comprehension is a representative task of natural language under...

Ruminating Reader: Reasoning with Gated Multi-Hop Attention (04/24/2017)
To answer the question in machine comprehension (MC) task, the models ne...

Multi-Perspective Context Matching for Machine Comprehension (12/13/2016)
Previous machine comprehension (MC) datasets are either too small to tra...

BAG: Bi-directional Attention Entity Graph Convolutional Network for Multi-hop Reasoning Question Answering (04/10/2019)
Multi-hop reasoning question answering requires deep comprehension of re...

MEMEN: Multi-layer Embedding with Memory Networks for Machine Comprehension (07/28/2017)
Machine comprehension (MC) style question answering is a representative p...

Conditioning LSTM Decoder and Bi-directional Attention Based Question Answering System (05/02/2019)
Applying neural-networks on Question Answering has gained increasing pop...
