Sequential Feature Filtering Classifier

06/21/2020
by Minseok Seo, et al.

We propose the Sequential Feature Filtering Classifier (FFC), a simple but effective classifier for convolutional neural networks (CNNs). Using sequential LayerNorm and ReLU, FFC zeroes out low-activation units and preserves high-activation units. This sequential feature filtering process generates multiple features, which are fed into a shared classifier to produce multiple outputs. FFC can be applied to any CNN with a classifier and significantly improves performance with negligible overhead. We extensively validate the efficacy of FFC on various tasks: ImageNet-1K classification, MS COCO detection, Cityscapes segmentation, and HMDB51 action recognition. Moreover, we empirically show that FFC further improves performance on top of other techniques, including attention modules and augmentation methods. The code and models will be publicly available.
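To make the idea concrete, below is a minimal PyTorch sketch of such a feature-filtering head, written from the abstract alone. The class name FFCHead, the number of filtering stages, and the averaging of the multiple logits at the end are illustrative assumptions, not details taken from the paper; each stage simply applies LayerNorm followed by ReLU and reuses one shared linear classifier.

import torch
import torch.nn as nn


class FFCHead(nn.Module):
    """Sketch of a sequential feature-filtering classifier head (assumed design).

    Each stage applies LayerNorm then ReLU, which zeroes out low-activation
    units while preserving high-activation units; every filtered feature is
    passed through the same shared linear classifier.
    """

    def __init__(self, feat_dim, num_classes, num_stages=2):
        super().__init__()
        self.filters = nn.ModuleList(
            nn.Sequential(nn.LayerNorm(feat_dim), nn.ReLU(inplace=True))
            for _ in range(num_stages)
        )
        self.classifier = nn.Linear(feat_dim, num_classes)  # shared across all stages

    def forward(self, feat):
        # feat: (batch, feat_dim), e.g. globally pooled CNN backbone features
        outputs = [self.classifier(feat)]            # prediction from the unfiltered feature
        for filt in self.filters:
            feat = filt(feat)                        # progressively filter the feature
            outputs.append(self.classifier(feat))    # shared classifier on each filtered feature
        return outputs


# Usage sketch: averaging the multiple logits is one simple aggregation choice,
# not necessarily the scheme used in the paper.
head = FFCHead(feat_dim=2048, num_classes=1000, num_stages=2)
pooled = torch.randn(8, 2048)                        # stand-in for pooled backbone features
logits = torch.stack(head(pooled)).mean(dim=0)       # (8, 1000)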

Related research

CBAM: Convolutional Block Attention Module (07/17/2018)
Funnel Activation for Visual Recognition (07/23/2020)
Exploiting Features with Split-and-Share Module (08/10/2021)
BAM: Bottleneck Attention Module (07/17/2018)
TDAN: Top-Down Attention Networks for Enhanced Feature Selectivity in CNNs (11/26/2021)
Multi-Grained-Attention Gated Convolutional Neural Networks for Sentence Classification (08/22/2018)
