Visual Attention Emerges from Recurrent Sparse Reconstruction

04/23/2022
by Baifeng Shi, et al.

In human vision, visual attention enables robust perception under noise, corruption, and distribution shifts, conditions where modern neural networks still fall short. We present VARS, Visual Attention from Recurrent Sparse reconstruction, a new attention formulation built on two prominent features of the human visual attention mechanism: recurrency and sparsity. Related features are grouped together through recurrent connections between neurons, and salient objects emerge through sparse regularization. VARS adopts an attractor network with recurrent connections that converges toward a stable pattern over time. Its layers are represented as ordinary differential equations (ODEs), which casts attention as a recurrent attractor network that equivalently optimizes a sparse reconstruction of the input using a dictionary of "templates" encoding the underlying patterns of the data. We show that self-attention is a special case of VARS with single-step optimization and no sparsity constraint. VARS can be used as a drop-in replacement for self-attention in popular vision transformers, consistently improving their robustness across various benchmarks. Code is released on GitHub (https://github.com/bfshi/VARS).
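
To make the formulation concrete, below is a minimal sketch, not the authors' released implementation (see the GitHub link above), of attention as unrolled sparse reconstruction. It uses ISTA, a standard iterative solver for L1-regularized reconstruction, so that each iteration plays the role of one tick of the recurrent attractor dynamics. All names here (vars_attention, dictionary, num_steps, step_size, lam) are illustrative assumptions, not identifiers from the paper's code.

```python
import torch
import torch.nn.functional as F

def soft_threshold(z, lam):
    # Proximal operator of the L1 penalty: shrinks every code toward
    # zero and clips small ones to exactly zero, producing sparsity.
    return torch.sign(z) * F.relu(z.abs() - lam)

def vars_attention(x, dictionary, num_steps=8, step_size=0.1, lam=0.05):
    # Illustrative sketch: attention as sparse reconstruction, solving
    #   min_z ||x - z @ D||^2 + lam * ||z||_1
    # with unrolled ISTA steps. Each step is one tick of the recurrent
    # dynamics; the fixed point is the sparse code for the input.
    D = dictionary                                   # (k, d) templates
    z = torch.zeros(x.shape[0], D.shape[0],
                    dtype=x.dtype, device=x.device)  # codes, (n, k)
    for _ in range(num_steps):
        residual = x - z @ D                         # (n, d) reconstruction error
        z = soft_threshold(z + step_size * (residual @ D.T),
                           step_size * lam)
    return z @ D                                     # attended tokens, (n, d)

# Toy usage: 16 tokens with 64-dim features, 32 random "templates".
x = torch.randn(16, 64)
D = torch.randn(32, 64)
out = vars_attention(x, D)                           # (16, 64)
```

Under this reading, taking a single update step with no sparsity penalty (lam = 0) and letting the tokens themselves serve as the dictionary yields a similarity-weighted mixture of tokens, which is the sense in which the abstract describes self-attention as a special case of VARS.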

research · 04/26/2022 · Understanding The Robustness in Vision Transformers
Recent studies show that Vision Transformers (ViTs) exhibit strong robust...

research · 03/23/2023 · Top-Down Visual Attention from Analysis by Synthesis
Current attention algorithms (e.g., self-attention) are stimulus-driven...

research · 07/21/2022 · Multi Resolution Analysis (MRA) for Approximate Self-Attention
Transformers have emerged as a preferred model for many tasks in natural...

research · 05/08/2022 · SparseTT: Visual Tracking with Sparse Transformers
Transformers have been successfully applied to the visual tracking task...

research · 02/08/2023 · Cross-Layer Retrospective Retrieving via Layer Attention
More and more evidence has shown that strengthening layer interactions c...

research · 02/14/2020 · Electricity Theft Detection with self-attention
In this work we propose a novel self-attention mechanism model to addres...

research · 08/26/2021 · Glimpse-Attend-and-Explore: Self-Attention for Active Visual Exploration
Active visual exploration aims to assist an agent with a limited field o...
