Squeeze-and-Excitation Networks

09/05/2017
by Jie Hu, et al.

Convolutional neural networks are built upon the convolution operation, which extracts informative features by fusing spatial and channel-wise information together within local receptive fields. In order to boost the representational power of a network, much existing work has shown the benefits of enhancing spatial encoding. In this work, we focus on channels and propose a novel architectural unit, which we term the "Squeeze-and-Excitation" (SE) block, that adaptively recalibrates channel-wise feature responses by explicitly modelling interdependencies between channels. We demonstrate that by stacking these blocks together, we can construct SENet architectures that generalise extremely well across challenging datasets. Crucially, we find that SE blocks produce significant performance improvements for existing state-of-the-art deep architectures at slight computational cost. SENets formed the foundation of our ILSVRC 2017 classification submission, which won first place and significantly reduced the top-5 error to 2.251%, a ~25% relative improvement over the winning entry of 2016.
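The squeeze-excitation-scale pipeline described above can be sketched in plain NumPy. This is a minimal illustration, not the paper's trained implementation: the two fully connected layers use random weights here (in a real SENet they are learned end-to-end), and the reduction ratio `r` follows the paper's bottleneck idea.

```python
import numpy as np

def se_block(x, r=16, seed=0):
    """Minimal sketch of a Squeeze-and-Excitation block on a (C, H, W) feature map.

    Weights w1/w2 are random stand-ins for the learned FC layers."""
    c = x.shape[0]
    # Squeeze: global average pooling collapses spatial dims into a channel descriptor.
    z = x.mean(axis=(1, 2))                        # shape (C,)
    # Excitation: bottleneck FC -> ReLU -> FC -> sigmoid yields per-channel gates.
    rng = np.random.default_rng(seed)
    w1 = rng.standard_normal((c // r, c)) * 0.1    # reduction weights (illustrative)
    w2 = rng.standard_normal((c, c // r)) * 0.1    # expansion weights (illustrative)
    s = 1.0 / (1.0 + np.exp(-(w2 @ np.maximum(w1 @ z, 0.0))))  # gates in (0, 1)
    # Scale: recalibrate each channel of the input by its gate.
    return x * s[:, None, None]

x = np.ones((32, 4, 4))
y = se_block(x)
print(y.shape)  # same shape as the input: the block is a drop-in residual-friendly unit
```

Because the output shape matches the input, the block can be inserted after any convolutional stage of an existing architecture, which is how the paper upgrades networks such as ResNet at slight computational cost.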


Related research

08/23/2018 — Recalibrating Fully Convolutional Networks with Spatial and Channel 'Squeeze & Excitation' Blocks
In a wide range of semantic segmentation tasks, fully convolutional neur...

03/07/2018 — Concurrent Spatial and Channel Squeeze & Excitation in Fully Convolutional Networks
Fully convolutional neural networks (F-CNNs) have set the state-of-the-a...

08/04/2019 — Sound Event Detection in Multichannel Audio using Convolutional Time-Frequency-Channel Squeeze and Excitation
In this study, we introduce a convolutional time-frequency-channel "Sque...

01/06/2019 — Channel Locality Block: A Variant of Squeeze-and-Excitation
Attention mechanism is a hot spot in deep learning field. Using channel ...

03/29/2022 — ME-CapsNet: A Multi-Enhanced Capsule Networks with Routing Mechanism
Convolutional Neural Networks need the construction of informative featu...

09/06/2019 — Linear Context Transform Block
Squeeze-and-Excitation (SE) block presents a channel attention mechanism...

07/05/2021 — Tiled Squeeze-and-Excite: Channel Attention With Local Spatial Context
In this paper we investigate the amount of spatial context required for ...
