Channel Locality Block: A Variant of Squeeze-and-Excitation

01/06/2019
by Huayu Li, et al.

Attention mechanisms are a hot topic in the deep learning field, and channel attention in particular is an effective way to improve the performance of convolutional neural networks. The Squeeze-and-Excitation (SE) block exploits channel dependencies, selectively emphasizing informative channels and suppressing less useful ones. In this paper, we propose a variant of the SE block based on channel locality. Instead of using fully connected layers to model global channel dependencies, we adopt convolutional layers to learn the correlations between nearby channels. We term this new algorithm the Channel Locality (C-Local) block. We evaluate the SE block and the C-Local block by applying them to different CNN architectures on the CIFAR-10 dataset, and observe that the C-Local block achieves higher accuracy than the SE block.
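
To make the architectural difference concrete, here is a minimal sketch (assuming PyTorch) contrasting a standard SE block with one plausible C-Local-style block, in which a 1D convolution slides over the pooled channel descriptor so that each channel's weight depends only on its neighbors. The class names, kernel size, and reduction ratio are illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn


class SEBlock(nn.Module):
    """Squeeze-and-Excitation: fully connected layers model global channel dependence."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # squeeze: one descriptor per channel
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        w = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * w  # excitation: rescale every channel


class CLocalBlock(nn.Module):
    """C-Local-style block (illustrative): a 1D convolution over the channel
    axis learns correlations between nearby channels instead of a global mapping."""

    def __init__(self, kernel_size=9):  # kernel size is an assumption
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.conv = nn.Conv1d(1, 1, kernel_size, padding=kernel_size // 2)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        b, c, _, _ = x.shape
        s = self.pool(x).view(b, 1, c)  # treat the channel descriptor as a 1D signal
        w = self.sigmoid(self.conv(s)).view(b, c, 1, 1)
        return x * w


# Example: rescale a batch of 64-channel feature maps with either block.
x = torch.randn(8, 64, 32, 32)
assert SEBlock(64)(x).shape == CLocalBlock()(x).shape == x.shape
```

One design consequence worth noting: because the local variant in this sketch has no per-channel parameters, the same module works for any channel count, whereas the SE block must be sized to the layer it follows.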
