Towards Understanding the Effectiveness of Attention Mechanism

06/29/2021
by Xiang Ye, et al.

The attention mechanism is a widely used method for improving the performance of convolutional neural networks (CNNs) on computer vision tasks. Despite its pervasiveness, we have a poor understanding of where its effectiveness stems from. It is popularly attributed to the visual-attention explanation: focusing on the important parts of the input rather than ingesting the entire input. In this paper, we find only a weak consistency between the attention weights of features and their importance. Instead, we verify the crucial role of feature map multiplication in the attention mechanism and uncover its fundamental impact on the landscapes learned by CNNs: through the high-order non-linearity it introduces, feature map multiplication plays a regularization role, leading CNNs to learn smoother and more stable landscapes near real samples than vanilla CNNs do. This smoothness and stability induce more predictive and stable behavior in between real samples and make CNNs generalize better. Moreover, motivated by this explanation of its effectiveness, we design the feature map multiplication network (FMMNet) by simply replacing the feature map addition in ResNet with feature map multiplication. FMMNet outperforms ResNet on various datasets, indicating that feature map multiplication plays a vital role in improving performance even without the finely designed attention mechanisms of existing methods.
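The contrast the abstract draws can be illustrated with a minimal NumPy sketch. The block below is not the paper's FMMNet implementation; the random linear map `fx` is a hypothetical stand-in for a block's convolutional branch. It shows the three combination rules side by side, and why multiplication adds a higher-order non-linearity: because `fx` is linear in `x`, scaling the input by `t` scales the multiplicative output by `t**2`, while the additive output scales only by `t`.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy "feature map" of shape (channels, height, width); the branch fx
# stands in for a block's conv transform (hypothetical linear map here).
x = rng.standard_normal((4, 8, 8))
w = rng.standard_normal((4, 4))
fx = np.einsum('oc,chw->ohw', w, x)  # linear in x

residual_out = x + fx              # vanilla ResNet block: feature map addition
attention_out = x * sigmoid(fx)    # attention-style gating: weighted multiplication
fmm_out = x * fx                   # FMMNet-style: raw feature map multiplication

# Higher-order non-linearity: scaling x by t scales fmm_out by t**2,
# whereas the additive branch scales only linearly in t.
t = 2.0
fx_scaled = np.einsum('oc,chw->ohw', w, t * x)
assert np.allclose((t * x) * fx_scaled, t**2 * fmm_out)
assert np.allclose((t * x) + fx_scaled, t * residual_out)
```

The quadratic dependence on the input is the "high order non-linearity" the abstract credits with the regularization effect; the sigmoid-gated variant is the form attention mechanisms typically use.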

Related research

LFI-CAM: Learning Feature Importance for Better Visual Explanation (05/03/2021)
Class Activation Mapping (CAM) is a powerful technique used to understan...

TAME: Attention Mechanism Based Feature Fusion for Generating Explanation Maps of Convolutional Neural Networks (01/18/2023)
The apparent “black box” nature of neural networks is a barrier to adopt...

A Trainable Multiplication Layer for Auto-correlation and Co-occurrence Extraction (05/30/2019)
In this paper, we propose a trainable multiplication layer (TML) for a n...

Intentional Attention Mask Transformation for Robust CNN Classification (05/07/2019)
Convolutional Neural Networks have achieved impressive results in variou...

Relation-Aware Global Attention (04/05/2019)
Attention mechanism aims to increase the representation power by focusin...

Attentive Max Feature Map for Acoustic Scene Classification with Joint Learning considering the Abstraction of Classes (04/15/2021)
The attention mechanism has been widely adopted in acoustic scene classi...

Uncovering the Origins of Instability in Dynamical Systems: How Attention Mechanism Can Help? (12/19/2022)
The behavior of the network and its stability are governed by both dynam...
