A Biologically Inspired Visual Working Memory for Deep Networks

01/09/2019
by Ethan Harris, et al.

The ability to look multiple times through a series of pose-adjusted glimpses is fundamental to human vision. This critical faculty allows us to understand highly complex visual scenes. Short-term memory plays an integral role in aggregating the information obtained from these glimpses and informing our interpretation of the scene. Computational models have attempted to address glimpsing and visual attention, but have failed to incorporate the notion of memory. We introduce a novel, biologically inspired visual working memory architecture that we term the Hebb-Rosenblatt memory. We subsequently introduce a fully differentiable Short Term Attentive Working Memory model (STAWM) which uses transformational attention to learn a memory over each image it sees. The state of our Hebb-Rosenblatt memory is embedded in STAWM as the weight space of a layer. By projecting different queries through this layer we can obtain goal-oriented latent representations for tasks including classification and visual reconstruction. Our model obtains highly competitive classification performance on MNIST and CIFAR-10. As demonstrated on the CelebA dataset, to perform reconstruction the model learns to make a sequence of updates to a canvas which constitute a parts-based representation. Classification with the self-supervised representation obtained from MNIST is shown to be in line with state-of-the-art models (none of which use a visual attention mechanism). Finally, we show that STAWM can be trained under the dual constraints of classification and reconstruction to provide an interpretable visual sketchpad which helps open the 'black-box' of deep learning.
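The mechanism described above, a memory whose state lives in the weight matrix of a layer, written with Hebbian updates and read by projecting queries through it, can be sketched minimally as follows. This is an illustrative NumPy sketch of a generic Hebbian outer-product memory, not the paper's actual STAWM implementation; the dimension `d`, learning rate `eta`, and the `write`/`read` helpers are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 16     # glimpse-embedding dimension (hypothetical)
eta = 0.1  # Hebbian learning rate (hypothetical)

# Memory state: the weight matrix of a single linear layer.
M = np.zeros((d, d))

def write(M, glimpse_embedding, eta=eta):
    """Accumulate a Hebbian (outer-product) update for one glimpse."""
    return M + eta * np.outer(glimpse_embedding, glimpse_embedding)

def read(M, query):
    """Project a task-specific query through the memory layer."""
    return query @ M

# Build the memory from a sequence of glimpses over one image.
glimpses = [rng.standard_normal(d) for _ in range(4)]
for g in glimpses:
    M = write(M, g)

# Different queries through the same memory yield different
# goal-oriented latent representations (e.g. one for classification,
# one for reconstruction).
query = rng.standard_normal(d)
latent = read(M, query)
print(latent.shape)  # (16,)
```

In the full model each of these steps would be differentiable, so gradients flow through both the writes and the reads; the outer-product write here is the simplest Hebbian rule consistent with the description.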


