Iterative Alternating Neural Attention for Machine Reading

06/07/2016
by Alessandro Sordoni, et al.

We propose a novel neural attention architecture to tackle machine comprehension tasks, such as answering Cloze-style queries with respect to a document. Unlike previous models, we do not collapse the query into a single vector; instead, we deploy an iterative alternating attention mechanism that allows a fine-grained exploration of both the query and the document. Our model outperforms state-of-the-art baselines on standard machine comprehension benchmarks such as the CNN news articles and the Children's Book Test (CBT) datasets.
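The mechanism described above alternates between attending over the query and attending over the document, carrying an inference state across iterations. The following is a minimal NumPy sketch of that idea; the function name, bilinear scoring matrices `W_q`/`W_d`, and the simple state update are illustrative assumptions (the paper's actual model uses learned encoders and a GRU-based inference state), not the authors' implementation:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def alternating_attention(query, doc, state, W_q, W_d, n_steps=3):
    """Illustrative sketch of iterative alternating attention.

    query: (T_q, d) encoded query token vectors
    doc:   (T_d, d) encoded document token vectors
    state: (d,) inference state carried across iterations
    """
    for _ in range(n_steps):
        # 1) Attend over the query, conditioned on the current state.
        q_scores = softmax(query @ W_q @ state)      # (T_q,)
        q_glimpse = q_scores @ query                 # (d,)
        # 2) Attend over the document, conditioned on the query glimpse.
        d_scores = softmax(doc @ W_d @ q_glimpse)    # (T_d,)
        d_glimpse = d_scores @ doc                   # (d,)
        # 3) The real model updates the state with a recurrent unit;
        #    blending the two glimpses is a hypothetical simplification.
        state = 0.5 * (q_glimpse + d_glimpse)
    # Final document attention weights; for Cloze-style answering these
    # would be aggregated over candidate answer tokens.
    return d_scores
```

Because each iteration re-attends over both inputs, later steps can refine which query words matter given what was found in the document, which is the fine-grained exploration the abstract contrasts with single-vector query representations.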


Related research

07/15/2016
Attention-over-Attention Neural Networks for Reading Comprehension
Cloze-style queries are representative problems in reading comprehension...

05/26/2018
Dependent Gated Reading for Cloze-Style Question Answering
We present a novel deep learning architecture to address the cloze-style...

03/04/2016
Text Understanding with the Attention Sum Reader Network
Several large cloze-style context-question-answer datasets have been int...

06/05/2016
Gated-Attention Readers for Text Comprehension
In this paper we study the problem of answering cloze-style questions ov...

10/06/2019
Fine-Grained Analysis of Propaganda in News Articles
Propaganda aims at influencing people's mindset with the purpose of adva...

11/10/2018
Densely Connected Attention Propagation for Reading Comprehension
We propose DecaProp (Densely Connected Attention Propagation), a new den...

05/05/2017
Sequential Attention: A Context-Aware Alignment Function for Machine Reading
In this paper we propose a neural network model with a novel Sequential ...
