Efficient Attention Network: Accelerate Attention by Searching Where to Plug

11/28/2020
by Zhongzhan Huang, et al.

Recently, many plug-and-play self-attention modules have been proposed to enhance model generalization by exploiting the internal information of deep convolutional neural networks (CNNs). Previous works emphasize the design of attention modules for specific functionality, e.g., lightweight or task-oriented attention. However, they overlook the importance of where to plug in the attention module: they attach a module to every block of the entire CNN backbone as a matter of course, so the extra computational cost and parameter count grow with network depth. Thus, we propose a framework called Efficient Attention Network (EAN) to improve the efficiency of existing attention modules. In EAN, we leverage the sharing mechanism (Huang et al. 2020) to share the attention module within the backbone, and we search where to connect the shared module via reinforcement learning. The result is an attention network with sparse connections between the backbone and modules that (1) maintains accuracy, (2) reduces the extra parameter increment, and (3) accelerates inference. Extensive experiments on widely-used benchmarks and popular attention networks show the effectiveness of EAN. Furthermore, we empirically show that EAN transfers to other tasks and captures informative features. The code is available at https://github.com/gbup-group/EAN-efficient-attention-network
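The abstract combines two ingredients: a single attention module shared across the backbone, and a sparse, searched set of plug-in points. The following is a minimal PyTorch-style sketch of that idea under stated assumptions, not the paper's actual implementation: `SharedSEAttention`, `EANBackbone`, and the hand-fixed `plug_mask` are all illustrative names, and in EAN the mask would come from the reinforcement-learning search rather than being set by hand.

```python
# Sketch: one shared SE-style attention module, plugged in only at a
# searched subset of backbone blocks encoded as a binary mask.
# All class and variable names here are illustrative assumptions.

import torch
import torch.nn as nn


class SharedSEAttention(nn.Module):
    """A single SE-style attention module reused at every plug point."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w  # channel-wise reweighting


class EANBackbone(nn.Module):
    """Backbone whose blocks optionally call the shared attention module.

    plug_mask[i] == 1 means "apply attention after block i". In EAN this
    mask is the outcome of the RL search; here it is fixed by hand.
    """

    def __init__(self, blocks: nn.ModuleList, channels: int, plug_mask):
        super().__init__()
        assert len(plug_mask) == len(blocks)
        self.blocks = blocks
        self.attention = SharedSEAttention(channels)  # shared, not per-block
        self.plug_mask = plug_mask

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for block, plugged in zip(self.blocks, self.plug_mask):
            x = block(x)
            if plugged:  # sparse connections: attention only where searched
                x = self.attention(x)
        return x


if __name__ == "__main__":
    # Toy backbone: four conv blocks with a constant channel width so a
    # single attention module can serve every plug point.
    C = 64
    blocks = nn.ModuleList(
        nn.Sequential(nn.Conv2d(C, C, 3, padding=1), nn.ReLU(inplace=True))
        for _ in range(4)
    )
    # Hypothetical search result: plug attention after blocks 1 and 3 only.
    net = EANBackbone(blocks, channels=C, plug_mask=[0, 1, 0, 1])
    y = net(torch.randn(2, C, 32, 32))
    print(y.shape)  # torch.Size([2, 64, 32, 32])
```

The toy backbone keeps a constant channel width because one shared module must match the channel count at every plug point; in a real ResNet-style network the module would presumably be shared within a stage, where channel counts agree, as in the sharing mechanism of Huang et al. (2020).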

