TDAN: Top-Down Attention Networks for Enhanced Feature Selectivity in CNNs

11/26/2021
by Shantanu Jaiswal, et al.

Attention modules for Convolutional Neural Networks (CNNs) are an effective method to enhance the performance of networks on multiple computer-vision tasks. While many works focus on building more effective modules through appropriate modelling of channel-, spatial- and self-attention, they primarily operate in a feedforward manner. Consequently, the attention mechanism strongly depends on the representational capacity of a single input feature activation, and can benefit from incorporating semantically richer higher-level activations that specify "what and where to look" through top-down information flow. Such feedback connections are also prevalent in the primate visual cortex and are recognized by neuroscientists as a key component of primate visual attention. Accordingly, in this work, we propose a lightweight top-down (TD) attention module that iteratively generates a "visual searchlight" to perform top-down channel and spatial modulation of its inputs and consequently outputs more selective feature activations at each computation step. Our experiments indicate that integrating TD in CNNs enhances their performance on ImageNet-1k classification and outperforms prominent attention modules while being more parameter and memory efficient. Further, our models are more robust to changes in input resolution during inference and learn to "shift attention" by localizing individual objects or features at each computation step without any explicit supervision. This capability results in improved weakly-supervised object localization, alongside gains in fine-grained and multi-label classification.
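To make the idea of iterative top-down channel and spatial modulation concrete, below is a minimal PyTorch sketch of such a block. It is not the authors' implementation: the class name, the bottleneck MLP, the 1x1 spatial convolution, the use of global average pooling as the "searchlight" context, and the number of iteration steps are all illustrative assumptions.

```python
# Minimal sketch of an iterative top-down attention block (assumed design,
# not the paper's TD module). At each step, pooled context from the current
# output acts as a "searchlight" that gates the input along channel and
# spatial axes, producing progressively more selective activations.
import torch
import torch.nn as nn


class TopDownAttention(nn.Module):
    def __init__(self, channels: int, reduction: int = 16, steps: int = 2):
        super().__init__()
        self.steps = steps
        hidden = max(channels // reduction, 8)
        # Bottleneck MLP: pooled context -> per-channel gate.
        self.to_channel = nn.Sequential(
            nn.Linear(channels, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, channels),
        )
        # 1x1 convolution: modulated features -> single spatial gate map.
        self.to_spatial = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = x
        for _ in range(self.steps):
            # Top-down context from the current (higher-level) output.
            context = out.mean(dim=(2, 3))                            # (B, C)
            channel_gate = torch.sigmoid(self.to_channel(context))    # (B, C)
            out = x * channel_gate[:, :, None, None]                  # channel modulation
            spatial_gate = torch.sigmoid(self.to_spatial(out))        # (B, 1, H, W)
            out = out * spatial_gate                                  # spatial modulation
        return out


if __name__ == "__main__":
    block = TopDownAttention(channels=64)
    feats = torch.randn(2, 64, 32, 32)
    print(block(feats).shape)  # torch.Size([2, 64, 32, 32])
```

Because the gates are recomputed from the previous step's output, each iteration can emphasize a different channel/spatial region, which is one way a module of this form could exhibit the "attention shifting" behavior described in the abstract.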


