CAT: Learning to Collaborate Channel and Spatial Attention from Multi-Information Fusion

12/13/2022
by Zizhang Wu, et al.

Channel and spatial attention mechanisms have been shown to provide an evident performance boost for deep convolutional neural networks (CNNs). Most existing methods focus on only one of the two attentions, or run them in parallel (or in series), neglecting the collaboration between them. To better establish feature interaction between the two types of attention, we propose a plug-and-play attention module, which we term "CAT": activating the Collaboration between spatial and channel Attentions based on learned Traits. Specifically, we represent traits as trainable coefficients (i.e., colla-factors) that adaptively combine the contributions of different attention modules to better fit different image hierarchies and tasks. Moreover, in addition to the global average pooling (GAP) and global maximum pooling (GMP) operators, we propose global entropy pooling (GEP), an effective component for suppressing noise signals by measuring the information disorder of feature maps. We introduce this three-way pooling operation into the attention modules and apply an adaptive mechanism to fuse their outcomes. Extensive experiments on MS COCO, Pascal VOC, CIFAR-100, and ImageNet show that CAT outperforms existing state-of-the-art attention mechanisms in object detection, instance segmentation, and image classification. The model and code will be released soon.
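The abstract names two concrete mechanisms: a third pooling descriptor, GEP, used alongside GAP and GMP, and trainable colla-factors that adaptively weight the pooled descriptors. Below is a minimal PyTorch sketch of how a channel-attention branch along these lines might look. The module name `ChannelAttentionCAT`, the softmax-normalized colla-factors, and the exact entropy formulation of GEP are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelAttentionCAT(nn.Module):
    """Sketch of a CAT-style channel branch: GAP/GMP/GEP descriptors
    fused by trainable colla-factors (assumed formulation)."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        # Shared bottleneck MLP, as in SE/CBAM-style channel attention.
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )
        # One trainable coefficient per pooling path (the colla-factors).
        self.colla = nn.Parameter(torch.ones(3))

    def gep(self, x):
        # Global entropy pooling (assumed form): normalize each channel's
        # spatial map into a distribution, then take its Shannon entropy.
        b, c, _, _ = x.shape
        p = F.softmax(x.view(b, c, -1), dim=-1)
        return -(p * (p + 1e-8).log()).sum(dim=-1)  # (B, C)

    def forward(self, x):
        b, c, _, _ = x.shape
        gap = x.mean(dim=(2, 3))   # (B, C) average descriptor
        gmp = x.amax(dim=(2, 3))   # (B, C) max descriptor
        gep = self.gep(x)          # (B, C) entropy descriptor
        # Adaptive fusion: colla-factors weight the three descriptors.
        w = F.softmax(self.colla, dim=0)
        s = w[0] * self.mlp(gap) + w[1] * self.mlp(gmp) + w[2] * self.mlp(gep)
        return x * torch.sigmoid(s).view(b, c, 1, 1)


if __name__ == "__main__":
    m = ChannelAttentionCAT(channels=64)
    out = m(torch.randn(2, 64, 32, 32))
    print(out.shape)  # torch.Size([2, 64, 32, 32]), same shape as input
```

A spatial branch would follow the same pattern, pooling across the channel axis instead, with a second set of colla-factors balancing the two branches' contributions.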

Related research

06/13/2021 · DMSANet: Dual Multi Scale Attention Network
Attention mechanism of late has been quite popular in the computer visio...

12/22/2020 · FcaNet: Frequency Channel Attention Networks
Attention mechanism, especially channel attention, has gained great succ...

04/22/2019 · Stochastic Region Pooling: Make Attention More Expressive
Global Average Pooling (GAP) is used by default on the channel-wise atte...

01/30/2021 · SA-Net: Shuffle Attention for Deep Convolutional Neural Networks
Attention mechanisms, which enable a neural network to accurately focus ...

05/23/2023 · Efficient Multi-Scale Attention Module with Cross-Spatial Learning
Remarkable effectiveness of the channel or spatial attention mechanisms ...

03/04/2021 · Coordinate Attention for Efficient Mobile Network Design
Recent studies on mobile network design have demonstrated the remarkable...

09/06/2019 · Linear Context Transform Block
Squeeze-and-Excitation (SE) block presents a channel attention mechanism...
