Locally Enhanced Self-Attention: Rethinking Self-Attention as Local and Context Terms

07/12/2021
by Chenglin Yang, et al.

Self-attention has become prevalent in computer vision models. Inspired by fully connected Conditional Random Fields (CRFs), we decompose it into local and context terms. These correspond to the unary and binary terms of a CRF and are both implemented by attention mechanisms with projection matrices. We observe that the unary term contributes only marginally to the outputs, while standard CNNs, which rely solely on unary (local) operations, achieve strong performance on a variety of tasks. We therefore propose Locally Enhanced Self-Attention (LESA), which strengthens the unary term by incorporating convolution and uses a fusion module to dynamically couple the unary and binary operations. In our experiments, we replace self-attention modules with LESA. Results on ImageNet and COCO show the superiority of LESA over convolution and self-attention baselines for image recognition, object detection, and instance segmentation. The code is publicly available.
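As a rough illustration of the idea described in the abstract, the sketch below (PyTorch-style, not the authors' implementation; the module names, the fusion weighting, and all hyperparameters are assumptions) pairs a convolutional unary (local) branch with a self-attention binary (context) branch and couples them with a small learned fusion module:

```python
import torch
import torch.nn as nn

class LESA(nn.Module):
    """Minimal sketch of a Locally Enhanced Self-Attention block.

    Unary (local) term: a convolution over the feature map.
    Binary (context) term: multi-head self-attention over spatial positions.
    Fusion: per-term weights predicted from the two outputs.
    All design details here are illustrative assumptions, not the paper's
    exact architecture.
    """

    def __init__(self, dim, heads=8, kernel_size=3):
        super().__init__()
        # Unary / local term implemented with convolution.
        self.local = nn.Conv2d(dim, dim, kernel_size, padding=kernel_size // 2)
        # Binary / context term implemented with self-attention
        # (projection matrices live inside MultiheadAttention).
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        # Fusion module: dynamically weights the unary and binary outputs.
        self.fuse = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(2 * dim, 2, kernel_size=1),
            nn.Softmax(dim=1),
        )

    def forward(self, x):                       # x: (B, C, H, W)
        b, c, h, w = x.shape
        unary = self.local(x)                   # local term, (B, C, H, W)
        tokens = x.flatten(2).transpose(1, 2)   # (B, H*W, C)
        binary, _ = self.attn(tokens, tokens, tokens)
        binary = binary.transpose(1, 2).reshape(b, c, h, w)
        weights = self.fuse(torch.cat([unary, binary], dim=1))  # (B, 2, 1, 1)
        return weights[:, :1] * unary + weights[:, 1:] * binary


if __name__ == "__main__":
    x = torch.randn(2, 64, 14, 14)
    print(LESA(64)(x).shape)  # torch.Size([2, 64, 14, 14])
```

In this toy version the block is drop-in: it takes and returns a (B, C, H, W) feature map, so it could replace a self-attention module in a backbone, which mirrors how the experiments swap self-attention for LESA.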

