PKCAM: Previous Knowledge Channel Attention Module

11/14/2022
by Eslam Mohamed Bakar, et al.

Recently, attention mechanisms have been explored with ConvNets across both the spatial and channel dimensions. However, to the best of our knowledge, all existing methods devote their attention modules to capturing local interactions at a single scale. In this paper, we propose the Previous Knowledge Channel Attention Module (PKCAM), which captures channel-wise relations across different layers to model the global context. PKCAM is easily integrated into any feed-forward CNN architecture and trained in an end-to-end fashion, with a negligible footprint due to its lightweight design. We validate the architecture through extensive experiments on image classification and object detection tasks with different backbones, which show consistent performance improvements over their counterparts. Our code is published at https://github.com/eslambakr/EMCA.
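The abstract does not spell out the module's internals, but a minimal sketch of the core idea, channel attention conditioned on features from an earlier layer as well as the current one, might look like the following. This is an assumption-laden illustration, not the authors' implementation: the class name PKCAMSketch, the SE-style squeeze-and-excite structure, and the reduction parameter are all hypothetical; see the linked repository for the real code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class PKCAMSketch(nn.Module):
    """Hypothetical sketch of previous-knowledge channel attention.

    Pools per-channel descriptors from the current feature map and a
    feature map from an earlier layer, concatenates them, and produces
    per-channel gates that recalibrate the current features.
    """

    def __init__(self, curr_channels: int, prev_channels: int, reduction: int = 16):
        super().__init__()
        # Bottleneck width is an assumed design choice, not from the paper.
        hidden = max((curr_channels + prev_channels) // reduction, 4)
        self.fc = nn.Sequential(
            nn.Linear(curr_channels + prev_channels, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, curr_channels),
            nn.Sigmoid(),
        )

    def forward(self, x_curr: torch.Tensor, x_prev: torch.Tensor) -> torch.Tensor:
        # Squeeze each feature map to a per-channel descriptor.
        d_curr = F.adaptive_avg_pool2d(x_curr, 1).flatten(1)  # (B, C_curr)
        d_prev = F.adaptive_avg_pool2d(x_prev, 1).flatten(1)  # (B, C_prev)
        # Model cross-layer channel relations from the joint descriptor.
        gates = self.fc(torch.cat([d_curr, d_prev], dim=1))   # (B, C_curr)
        # Recalibrate the current feature map channel-wise.
        return x_curr * gates.unsqueeze(-1).unsqueeze(-1)
```

A quick usage check with arbitrary shapes: `PKCAMSketch(curr_channels=256, prev_channels=128)` applied to `x_curr` of shape `(2, 256, 28, 28)` and `x_prev` of shape `(2, 128, 56, 56)` returns a tensor the same shape as `x_curr`. Because the spatial dimensions are pooled away, the earlier layer can have any resolution, which is consistent with the abstract's claim of a lightweight, drop-in module.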
