Linear Attention Mechanism: An Efficient Attention for Semantic Segmentation

07/29/2020
by Rui Li, et al.

In this paper, to remedy the quadratic memory and computational cost of dot-product attention, we propose a Linear Attention Mechanism that approximates dot-product attention at much lower memory and computational cost. The efficient design makes it more flexible and versatile to incorporate attention mechanisms into neural networks. Experiments on semantic segmentation demonstrate the effectiveness of the linear attention mechanism. Code is available at https://github.com/lironui/Linear-Attention-Mechanism.
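For intuition, the sketch below contrasts standard dot-product attention, whose cost grows quadratically with the number of pixels N = H*W, with a generic kernelized linear attention that reassociates the matrix products so the cost grows linearly in N. This is an illustrative approximation only: the feature map (elu + 1), function names, and shapes are assumptions made for this example, not necessarily the exact formulation used in the paper or the linked repository.

import torch
import torch.nn.functional as F

def dot_product_attention(q, k, v):
    # Standard attention: softmax(Q K^T / sqrt(d)) V.
    # The N x N score matrix makes memory and compute O(N^2) in sequence length N.
    scores = torch.softmax(q @ k.transpose(-2, -1) / q.shape[-1] ** 0.5, dim=-1)
    return scores @ v

def linear_attention(q, k, v, eps=1e-6):
    # Kernelized approximation (generic sketch, not the paper's exact method):
    # replace softmax with a non-negative feature map phi, then reassociate
    # (phi(Q) phi(K)^T) V as phi(Q) (phi(K)^T V), which costs O(N d^2) instead of O(N^2 d).
    phi_q = F.elu(q) + 1.0  # assumed feature map choice for illustration
    phi_k = F.elu(k) + 1.0
    kv = phi_k.transpose(-2, -1) @ v  # (d, d_v) summary, independent of N^2
    normalizer = phi_q @ phi_k.sum(dim=-2, keepdim=True).transpose(-2, -1) + eps
    return (phi_q @ kv) / normalizer

# Quick shape check on flattened H*W "pixels" of a feature map.
q = torch.randn(2, 4096, 64)  # batch, N = H*W, channels
k = torch.randn(2, 4096, 64)
v = torch.randn(2, 4096, 64)
out = linear_attention(q, k, v)
print(out.shape)  # torch.Size([2, 4096, 64])

Because the (d, d_v) summary matrix never depends on an N x N score map, this style of attention can be attached to high-resolution feature maps where dot-product attention would exhaust memory.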

