Can Active Memory Replace Attention?

10/27/2016
by Łukasz Kaiser, et al.

Several mechanisms for focusing the attention of a neural network on selected parts of its input or memory have been used successfully in deep learning models in recent years. Attention has improved image classification, image captioning, speech recognition, generative models, and the learning of algorithmic tasks, but it has probably had the largest impact on neural machine translation. Recently, similar improvements have been obtained using alternative mechanisms that do not focus on a single part of a memory but operate on all of it in parallel, in a uniform way. Such a mechanism, which we call active memory, has improved over attention in algorithmic tasks, image processing, and generative modelling. So far, however, active memory has not improved over attention for most natural language processing tasks, in particular for machine translation. We analyze this shortcoming in this paper and propose an extended model of active memory that matches existing attention models on neural machine translation and generalizes better to longer sentences. We investigate this model and explain why previous active memory models did not succeed. Finally, we discuss when active memory brings the most benefit and where attention can be a better choice.
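The contrast the abstract draws can be made concrete with a minimal sketch. Below, soft dot-product attention reads one focused summary from memory via a weighted average, while an active-memory-style step (in the spirit of the Neural GPU's convolutional updates) transforms every memory slot in parallel with the same local operator. All shapes, weights, and function names here are illustrative assumptions, not the paper's actual model.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_read(memory, query):
    """Soft attention: score every memory slot against the query,
    then return one weighted average -- the model 'focuses' on
    (mostly) a single part of the memory."""
    scores = memory @ query          # (n_slots,)
    weights = softmax(scores)        # non-negative, sums to 1
    return weights @ memory          # (d,) a single focused summary

def active_memory_step(memory, kernel):
    """Active-memory-style update: apply the same local (convolutional)
    transformation to every slot in parallel, so the whole memory is
    rewritten uniformly instead of one part being selected."""
    n, d = memory.shape
    # same-padding by repeating the edge slots
    padded = np.vstack([memory[:1], memory, memory[-1:]])
    out = np.empty_like(memory)
    for i in range(n):
        window = padded[i:i + 3].reshape(-1)   # (3*d,) local neighbourhood
        out[i] = np.tanh(kernel @ window)      # (d,) updated slot i
    return out

rng = np.random.default_rng(0)
memory = rng.normal(size=(5, 4))        # 5 slots of width 4 (illustrative)
query = rng.normal(size=4)
kernel = rng.normal(size=(4, 12)) * 0.1

read = attention_read(memory, query)          # shape (4,): one vector
updated = active_memory_step(memory, kernel)  # shape (5, 4): all slots updated
```

Note the asymmetry: attention produces a single vector per step, whereas the active-memory step returns a full replacement memory, which is what allows it to act on all of the memory at once.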


