Hierarchical Memory Networks for Answer Selection on Unknown Words

09/28/2016
by Jiaming Xu, et al.

Recently, end-to-end memory networks have shown promising results on the question answering task: they encode past facts into an explicit memory and perform reasoning by making multiple computational steps over that memory. However, memory networks reason over sentence-level memory, producing coarse semantic vectors, and apply no further attention mechanism to individual words, which may cause the model to lose detailed information, especially when the answers are rare or unknown words. In this paper, we propose a novel Hierarchical Memory Network, dubbed HMN. First, we encode the past facts into a sentence-level memory and a word-level memory respectively. Then, k-max pooling is applied after the reasoning module on the sentence-level memory to sample the k sentences most relevant to a question, and these sentences are fed into an attention mechanism over the word-level memory to focus on the words in the selected sentences. Finally, the prediction is jointly learned over the outputs of the sentence-level reasoning module and the word-level attention mechanism. The experimental results demonstrate that our approach successfully conducts answer selection on unknown words and achieves better performance than memory networks.
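The pipeline described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the function name `hmn_forward`, the dot-product scoring, and the additive combination of the two outputs are all assumptions made for illustration; only the overall flow (sentence-level reasoning, k-max pooling, word-level attention restricted to the selected sentences, joint output) follows the description above.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax; -inf entries become exactly 0.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def hmn_forward(question, sent_mem, word_mem, sent_ids, k=2):
    """Hypothetical sketch of one HMN forward pass.

    question : (d,)          question embedding
    sent_mem : (n_sents, d)  sentence-level memory
    word_mem : (n_words, d)  word-level memory
    sent_ids : (n_words,)    index of the sentence each word belongs to
    """
    # Sentence-level reasoning: score each sentence against the question
    # (dot-product scoring is an assumption, not the paper's exact module).
    sent_scores = sent_mem @ question

    # k-max pooling: keep only the k sentences most relevant to the question.
    top_k = np.argsort(sent_scores)[-k:]

    # Word-level attention, restricted to words inside the selected sentences.
    mask = np.isin(sent_ids, top_k)
    word_scores = np.where(mask, word_mem @ question, -np.inf)
    word_attn = softmax(word_scores)

    # Joint prediction input: combine the sentence-level reasoning output
    # with the word-level attention output (additive fusion assumed here).
    sent_out = softmax(sent_scores) @ sent_mem
    word_out = word_attn @ word_mem
    return sent_out + word_out, word_attn
```

Restricting the word-level softmax with `-inf` guarantees that words outside the k selected sentences receive exactly zero attention, which is how the word-level stage can surface a rare or unknown word as the answer even though the sentence-level vectors are coarse.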


Related research

- 01/25/2018, A Question-Focused Multi-Factor Attention Network for Question Answering: Neural network models recently proposed for question answering (QA) prim...
- 09/05/2019, A Better Way to Attend: Attention with Trees for Video Question Answering: We propose a new attention model for video question answering. The main ...
- 09/27/2021, Recall and Learn: A Memory-augmented Solver for Math Word Problems: In this article, we tackle the math word problem, namely, automatically ...
- 07/20/2018, Question-Aware Sentence Gating Networks for Question and Answering: Machine comprehension question answering, which finds an answer to the q...
- 09/22/2021, A Simple Approach to Jointly Rank Passages and Select Relevant Sentences in the OBQA Context: In the open question answering (OBQA) task, how to select the relevant i...
- 06/13/2018, Generating Sentences Using a Dynamic Canvas: We introduce the Attentive Unsupervised Text (W)riter (AUTR), which is a...
- 11/01/2020, Seeing Both the Forest and the Trees: Multi-head Attention for Joint Classification on Different Compositional Levels: In natural languages, words are used in association to construct sentenc...
