Compositional Attention Networks for Machine Reasoning

03/08/2018
by Drew A. Hudson, et al.

We present the MAC network, a novel fully differentiable neural network architecture designed to facilitate explicit and expressive reasoning. Drawing inspiration from first principles of computer organization, MAC moves away from monolithic black-box neural architectures towards a design that encourages both transparency and versatility. The model approaches problems by decomposing them into a series of attention-based reasoning steps, each performed by a novel recurrent Memory, Attention, and Composition (MAC) cell that maintains a separation between control and memory. By stringing the cells together and imposing structural constraints that regulate their interaction, MAC effectively learns to perform iterative reasoning processes that are directly inferred from the data in an end-to-end approach. We demonstrate the model's strength, robustness and interpretability on the challenging CLEVR dataset for visual reasoning, achieving a new state-of-the-art 98.9% accuracy, halving the error rate of the previous best model. More importantly, we show that the model is computationally efficient and data efficient, in particular requiring 5x less data than existing models to achieve strong results.
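To make the abstract's description concrete, the recurrence it sketches (a control unit attending over the question, a read unit attending over a knowledge base, and a write unit updating memory) can be illustrated with a minimal NumPy toy. All weights, dimensions, and simplifications here are illustrative assumptions, not the paper's actual implementation, which uses learned gates, larger hidden sizes, and additional interaction terms:

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # hidden dimension (illustrative; the paper uses much larger states)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def mac_cell(c_prev, m_prev, q, words, kb, Wc, Wr, Wm):
    """One simplified MAC step, keeping control and memory separate."""
    # Control unit: new control state is an attention-weighted average
    # of the question words, conditioned on the question and c_prev.
    cq = Wc @ np.concatenate([c_prev, q])     # (d,)
    cv = softmax(words @ cq)                  # attention over question words
    c = cv @ words                            # new control state (d,)

    # Read unit: attend over knowledge-base items, guided by the
    # current control state and the previous memory.
    ri = kb * (Wr @ m_prev)                   # memory-kb interaction (n, d)
    rv = softmax((ri * c).sum(axis=1))        # attention over kb items
    r = rv @ kb                               # retrieved information (d,)

    # Write unit: integrate retrieved information into memory.
    m = Wm @ np.concatenate([r, m_prev])      # new memory state (d,)
    return c, m

# Toy inputs standing in for a question and an image-derived knowledge base.
q = rng.standard_normal(d)                    # question vector
words = rng.standard_normal((5, d))           # question word states
kb = rng.standard_normal((10, d))             # knowledge base (e.g. regions)
Wc = rng.standard_normal((d, 2 * d)) * 0.1
Wr = rng.standard_normal((d, d)) * 0.1
Wm = rng.standard_normal((d, 2 * d)) * 0.1

# String p cells together to form an iterative reasoning process.
c, m = np.zeros(d), np.zeros(d)
for _ in range(4):
    c, m = mac_cell(c, m, q, words, kb, Wc, Wr, Wm)
print(c.shape, m.shape)
```

The point of the sketch is the structural constraint the abstract emphasizes: the control state only ever reads from the question, while the memory state only ever integrates information retrieved from the knowledge base, and the two interact solely through attention.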


Related research

- Extending Compositional Attention Networks for Social Reasoning in Videos (10/03/2022)
- Linguistically Driven Graph Capsule Network for Visual Question Reasoning (03/23/2020)
- Differentiable Adaptive Computation Time for Visual Reasoning (04/27/2020)
- Inferring and Executing Programs for Visual Reasoning (05/10/2017)
- Towards Interpretable Reasoning over Paragraph Effects in Situation (10/03/2020)
- Learning Dynamics of Attention: Human Prior for Interpretable Machine Reasoning (05/28/2019)
- Learning Associative Inference Using Fast Weight Memory (11/16/2020)
