Switchable Self-attention Module

09/13/2022
by Shanshan Zhong et al.

Attention mechanisms have achieved great success in visual recognition. Many works improve their effectiveness by carefully designing the structure of the attention operator, but these designs require extensive experiments to find the optimal settings whenever the scenario changes, which consumes considerable time and computational resources. In addition, a neural network typically contains many layers, and most studies apply the same attention module to every layer, which limits further gains from the self-attention mechanism. To address these problems, we propose a self-attention module, SEM. Given the input to the attention module and a set of candidate attention operators, SEM automatically decides which operators to select and how to integrate them to compute attention maps. The effectiveness of SEM is demonstrated by extensive experiments on widely used benchmark datasets and popular self-attention networks.
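The abstract describes SEM only at a high level: a gating decision, driven by the module's input, over a pool of candidate attention operators whose outputs are combined into one attention map. The PyTorch sketch below illustrates that general idea under stated assumptions; the specific candidate operators (average- and max-pooled channel attention), the softmax gate, and all class names are illustrative choices of ours, not the authors' implementation.

# Minimal sketch of input-conditioned selection/integration of attention
# operators, assuming SE-style channel-attention candidates and a softmax gate.
import torch
import torch.nn as nn


class AvgPoolChannelAttention(nn.Module):
    """Candidate operator 1: channel attention driven by global average pooling."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels), nn.Sigmoid(),
        )

    def forward(self, x):                       # x: (B, C, H, W)
        return self.fc(x.mean(dim=(2, 3)))      # (B, C) attention map


class MaxPoolChannelAttention(nn.Module):
    """Candidate operator 2: same structure, driven by global max pooling."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.fc(x.amax(dim=(2, 3)))      # (B, C) attention map


class SwitchableAttention(nn.Module):
    """Gates over candidate attention operators based on the module's input."""
    def __init__(self, channels, operators):
        super().__init__()
        self.operators = nn.ModuleList(operators)
        # Gating network (assumption): pooled input features -> one weight per operator.
        self.gate = nn.Sequential(nn.Linear(channels, len(operators)), nn.Softmax(dim=1))

    def forward(self, x):                       # x: (B, C, H, W)
        g = self.gate(x.mean(dim=(2, 3)))       # (B, K) operator weights
        maps = torch.stack([op(x) for op in self.operators], dim=1)   # (B, K, C)
        attn = (g.unsqueeze(-1) * maps).sum(dim=1)                    # (B, C) integrated map
        return x * attn.unsqueeze(-1).unsqueeze(-1)                   # reweight input features


if __name__ == "__main__":
    c = 64
    sem = SwitchableAttention(c, [AvgPoolChannelAttention(c), MaxPoolChannelAttention(c)])
    print(sem(torch.randn(2, c, 32, 32)).shape)  # torch.Size([2, 64, 32, 32])

Because the gate is differentiable, the choice of operator weights is learned end to end with the rest of the network; a per-layer gate would let different layers settle on different operator mixtures, which is the kind of flexibility the abstract argues a single fixed attention module lacks.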


