Sequential Attention: A Context-Aware Alignment Function for Machine Reading

05/05/2017
by Sebastian Brarda, et al.

In this paper we propose a neural network model with a novel Sequential Attention layer that extends soft attention by assigning weights to the words of an input sequence in a way that accounts not just for how well each word matches a query, but also for how well its surrounding words match. We evaluate this approach on the task of reading comprehension (on the Who did What and CNN datasets) and show that it dramatically improves a strong baseline, the Stanford Reader, and is competitive with the state of the art.
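The abstract does not spell out the mechanism, but one plausible reading is that the per-token match scores of ordinary soft attention are passed through a sequence model before normalization, so each attention weight also reflects how well neighboring words match the query. Below is a minimal PyTorch sketch along those lines; the class name SequentialAttention, the BiLSTM-over-scores design, and all dimensions are illustrative assumptions, not necessarily the authors' exact architecture.

```python
import torch
import torch.nn as nn


class SequentialAttention(nn.Module):
    """Context-aware attention sketch: per-token match scores are smoothed by
    a bidirectional LSTM before the softmax, so each weight depends on how
    well nearby tokens match the query as well (an assumed design, not the
    paper's verified implementation)."""

    def __init__(self, hidden_dim: int, score_rnn_dim: int = 32):
        super().__init__()
        # BiLSTM runs over the sequence of scalar match scores (input size 1).
        self.score_rnn = nn.LSTM(1, score_rnn_dim,
                                 bidirectional=True, batch_first=True)
        self.proj = nn.Linear(2 * score_rnn_dim, 1)

    def forward(self, passage: torch.Tensor, query: torch.Tensor):
        # passage: (batch, seq_len, hidden_dim); query: (batch, hidden_dim)
        # Plain soft-attention scores: dot product of each token with the query.
        scores = torch.bmm(passage, query.unsqueeze(2))    # (batch, seq_len, 1)
        # Context-aware rescoring: each position sees its neighbors' scores.
        contextual, _ = self.score_rnn(scores)             # (batch, seq_len, 2*dim)
        weights = torch.softmax(self.proj(contextual).squeeze(2), dim=1)
        # Attention-weighted summary of the passage.
        summary = torch.bmm(weights.unsqueeze(1), passage).squeeze(1)
        return summary, weights
```

In a reader model, the returned weights would simply replace the standard soft-attention weights, e.g. in a Stanford Reader-style architecture, leaving the rest of the pipeline unchanged.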


