Funnel Activation for Visual Recognition

07/23/2020
by Ningning Ma et al.

We present a conceptually simple but effective funnel activation for image recognition tasks, called Funnel activation (FReLU), which extends ReLU and PReLU to a 2D activation by adding a spatial condition at negligible overhead. ReLU and PReLU take the forms y = max(x, 0) and y = max(x, px), respectively, while FReLU takes the form y = max(x, T(x)), where T(x) is a 2D spatial condition. Moreover, the spatial condition achieves pixel-wise modeling capacity in a simple way, capturing complicated visual layouts with regular convolutions. We conduct experiments on ImageNet, COCO detection, and semantic segmentation tasks, showing great improvements and robustness of FReLU across visual recognition tasks. Code is available at https://github.com/megvii-model/FunnelAct.
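To make the form y = max(x, T(x)) concrete, the following is a minimal sketch in which the spatial condition T(x) is modeled as a per-channel (depthwise) k x k convolution followed by batch normalization, in line with the "regular convolutions" mentioned above. The PyTorch framing, the class name FReLU, and the default 3x3 window are illustrative assumptions, not the authors' released code; see the linked repository for the official implementation.

import torch
import torch.nn as nn


class FReLU(nn.Module):
    """Sketch of the funnel activation y = max(x, T(x)).

    T(x) is assumed here to be a depthwise k x k convolution plus
    batch normalization, giving each pixel a spatial condition computed
    from its local neighborhood.
    """

    def __init__(self, channels: int, kernel_size: int = 3):
        super().__init__()
        # Depthwise convolution: one k x k filter per channel, so the
        # extra parameter and FLOP cost stays small.
        self.spatial = nn.Conv2d(
            channels, channels, kernel_size,
            padding=kernel_size // 2, groups=channels, bias=False,
        )
        self.bn = nn.BatchNorm2d(channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        t = self.bn(self.spatial(x))   # T(x): the 2D spatial condition
        return torch.max(x, t)         # y = max(x, T(x))


if __name__ == "__main__":
    layer = FReLU(channels=16)
    x = torch.randn(2, 16, 32, 32)
    print(layer(x).shape)  # torch.Size([2, 16, 32, 32])

Because the condition is depthwise, it adds only about k^2 parameters per channel, which is what keeps the overhead negligible compared with the backbone's regular convolutions.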
