Iterative Alternating Neural Attention for Machine Reading

06/07/2016
by Alessandro Sordoni, et al.

We propose a novel neural attention architecture to tackle machine comprehension tasks, such as answering Cloze-style queries with respect to a document. Unlike previous models, we do not collapse the query into a single vector; instead, we deploy an iterative alternating attention mechanism that allows fine-grained exploration of both the query and the document. Our model outperforms state-of-the-art baselines on standard machine comprehension benchmarks such as the CNN news articles and the Children's Book Test (CBT) datasets.
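For illustration, the sketch below shows one plausible reading of such an alternating attention loop, written in PyTorch: a recurrent search state first attends over the query, then, conditioned on the resulting query glimpse, attends over the document, and is finally updated from both glimpses before the next iteration. The module and tensor names, the bilinear scoring form, and the number of steps are assumptions made for this sketch, not the authors' exact parameterization.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AlternatingAttentionReader(nn.Module):
    """Hypothetical sketch of an iterative alternating attention loop."""

    def __init__(self, hidden_dim: int, num_steps: int = 8):
        super().__init__()
        self.num_steps = num_steps
        # Bilinear scorers mapping the search state to attention scores.
        self.query_attn = nn.Linear(hidden_dim, hidden_dim, bias=False)
        self.doc_attn = nn.Linear(2 * hidden_dim, hidden_dim, bias=False)
        # Recurrent search state, updated from the two glimpses each step.
        self.state_cell = nn.GRUCell(2 * hidden_dim, hidden_dim)

    def forward(self, query_enc: torch.Tensor, doc_enc: torch.Tensor):
        # query_enc: (batch, q_len, hidden) contextual query encodings
        # doc_enc:   (batch, d_len, hidden) contextual document encodings
        batch, _, hidden = query_enc.shape
        state = query_enc.new_zeros(batch, hidden)
        doc_weights = None
        for _ in range(self.num_steps):
            # 1) Attend over the QUERY, conditioned on the current state.
            q_scores = torch.bmm(query_enc, self.query_attn(state).unsqueeze(2))
            q_glimpse = (F.softmax(q_scores, dim=1) * query_enc).sum(dim=1)
            # 2) Attend over the DOCUMENT, conditioned on state + query glimpse.
            cond = torch.cat([state, q_glimpse], dim=1)
            d_scores = torch.bmm(doc_enc, self.doc_attn(cond).unsqueeze(2))
            doc_weights = F.softmax(d_scores, dim=1)
            d_glimpse = (doc_weights * doc_enc).sum(dim=1)
            # 3) Update the search state from both glimpses and iterate.
            state = self.state_cell(torch.cat([q_glimpse, d_glimpse], dim=1), state)
        # Final document attention weights, shape (batch, d_len); in a
        # Cloze setting these can be summed over candidate-token positions
        # to score answers, pointer-style.
        return doc_weights.squeeze(2)
```

The alternation is the key design point this sketch tries to capture: because the state re-attends over the query at every step, different query terms can drive different passes over the document, rather than a single fixed query vector conditioning all document attention.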
