ECA-Net: Efficient Channel Attention for Deep Convolutional Neural Networks

10/08/2019
by   Qilong Wang, et al.

Channel attention has recently been shown to offer great potential for improving the performance of deep convolutional neural networks (CNNs). However, most existing methods are dedicated to developing more sophisticated attention modules for better performance, which inevitably increases the computational burden. To overcome the paradox of the performance-complexity trade-off, this paper investigates an extremely lightweight attention module for boosting the performance of deep CNNs. In particular, we propose an Efficient Channel Attention (ECA) module, which involves only k (k < 9) parameters but brings a clear performance gain. By revisiting the channel attention module in SENet, we empirically show that avoiding dimensionality reduction and allowing appropriate cross-channel interaction are important for learning effective channel attention. We therefore propose a local cross-channel interaction strategy without dimensionality reduction, which can be efficiently implemented by a fast 1D convolution. Furthermore, we develop a function of the channel dimension to adaptively determine the kernel size of the 1D convolution, which controls the coverage of local cross-channel interaction. The ECA module can be flexibly incorporated into existing CNN architectures, and the resulting networks are named ECA-Net. We extensively evaluate the proposed ECA-Net on image classification, object detection, and instance segmentation with ResNet and MobileNetV2 backbones. The experimental results show that our ECA-Net is more efficient than its counterparts while performing favorably against them. The source code and models are available at https://github.com/BangguWu/ECANet.
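The mechanism the abstract describes (global average pooling, a fast 1D convolution across channels without dimensionality reduction, and a sigmoid gate) can be sketched in plain NumPy as follows. This is a minimal illustration, not the authors' implementation: the adaptive kernel-size rule and the gamma=2, b=1 defaults follow the paper, while the uniform convolution weights stand in for parameters that would be learned in practice.

```python
import numpy as np

def eca_kernel_size(channels, gamma=2, b=1):
    """Adaptive kernel size: nearest odd value of (log2(C) + b) / gamma,
    following the mapping proposed in the ECA-Net paper."""
    t = int(abs((np.log2(channels) + b) / gamma))
    return t if t % 2 == 1 else t + 1

def eca_attention(x, gamma=2, b=1):
    """Apply ECA-style channel attention to a (C, H, W) feature map."""
    C = x.shape[0]
    k = eca_kernel_size(C, gamma, b)
    # Global average pooling: one descriptor per channel -> shape (C,)
    y = x.mean(axis=(1, 2))
    # Fast 1D convolution across channels (local cross-channel interaction),
    # zero-padded so the output length stays C; the same k weights are
    # shared by all channels. Uniform weights here are purely illustrative.
    w = np.ones(k) / k
    pad = k // 2
    y_pad = np.pad(y, pad)
    conv = np.array([np.dot(y_pad[i:i + k], w) for i in range(C)])
    # Sigmoid gate, then rescale each channel of the input.
    attn = 1.0 / (1.0 + np.exp(-conv))
    return x * attn[:, None, None]
```

Note how the parameter count is just k (under 9 for typical channel dimensions): for C = 512 channels the rule above yields k = 5, versus the two C x C/r projection matrices a SE block would need.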
