Adaptive Memory Networks

02/01/2018
by Daniel Li, et al.

We present Adaptive Memory Networks (AMN), a model that processes input-question pairs to dynamically construct a network architecture optimized for lower inference times on Question Answering (QA) tasks. AMN processes the input story to extract entities and stores them in memory banks. Starting from a single bank, as the number of input entities increases, AMN learns to create new banks when the entropy in a single bank becomes too high. Hence, after processing an input-question(s) pair, the resulting network forms a hierarchical structure in which entities are stored in different banks, distanced by question relevance. At inference, only one or a few banks are used, creating a tradeoff between accuracy and performance. AMN is enabled by dynamic networks, which allow input-dependent network creation and efficient dynamic mini-batching, as well as by our novel bank controller, which allows learning discrete decisions with high accuracy. In our results, we demonstrate that AMN learns to create variable-depth networks depending on task complexity and reduces inference times for QA tasks.
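The abstract describes the bank-creation mechanism only at a high level. The toy sketch below illustrates one plausible reading of it: entities are scored against the question, and a new bank is opened once the relevance distribution inside the current bank becomes too uniform (high entropy). All names, dimensions, and the fixed entropy threshold are illustrative assumptions, not the authors' implementation; in particular, the paper's bank controller is a learned component, whereas this sketch substitutes a hand-written rule.

    # Minimal sketch of the memory-bank idea from the abstract.
    # Names, dimensions, and the entropy threshold are assumptions
    # made for illustration, not the authors' implementation.
    import numpy as np

    def softmax(x):
        e = np.exp(x - x.max())
        return e / e.sum()

    def entropy(p):
        return -np.sum(p * np.log(p + 1e-12))

    class MemoryBank:
        """Holds entity vectors judged to sit at a similar distance from the question."""
        def __init__(self):
            self.entities = []  # list of (name, vector) pairs

        def add(self, name, vec):
            self.entities.append((name, vec))

    class BankController:
        """Toy stand-in for the bank controller: scores each entity against the
        question and spawns a new bank when the relevance distribution inside
        the current bank becomes too uniform (high entropy)."""
        def __init__(self, entropy_threshold=1.5):
            self.entropy_threshold = entropy_threshold  # assumed hyperparameter
            self.banks = [MemoryBank()]                 # start from a single bank

        def insert(self, name, entity_vec, question_vec):
            bank = self.banks[-1]
            bank.add(name, entity_vec)
            # Relevance of every entity in the current bank to the question.
            scores = np.array([v @ question_vec for _, v in bank.entities])
            if len(scores) > 1 and entropy(softmax(scores)) > self.entropy_threshold:
                # Entropy too high: the bank mixes entities of differing
                # relevance, so open a fresh bank for subsequent insertions.
                self.banks.append(MemoryBank())

        def answer_banks(self, k=1):
            # At inference, only one or a few banks are consulted,
            # trading accuracy for lower inference time.
            return self.banks[:k]

    # Usage: random "entity" and "question" embeddings for illustration.
    rng = np.random.default_rng(0)
    question = rng.normal(size=16)
    ctrl = BankController()
    for i in range(20):
        ctrl.insert(f"entity_{i}", rng.normal(size=16), question)
    print(f"{len(ctrl.banks)} banks created; inference uses {len(ctrl.answer_banks())} bank(s)")

Running this with random embeddings partitions the 20 entities into a handful of banks; consulting only the first bank(s) at inference is the accuracy-performance tradeoff the abstract refers to.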
