First Power Linear Unit with Sign

11/29/2021
by Boxi Duan, et al.

This paper proposes a novel activation function termed FPLUS, which exploits a mathematical power function with polarized signs. It is inspired by the common inverse operation and carries an intuitive bionic interpretation. The formulation is derived theoretically from prior knowledge and anticipated properties, and its feasibility is then verified through a series of experiments on typical benchmark datasets. The results indicate that our approach is highly competitive among numerous activation functions and remains stably compatible across many CNN architectures. Furthermore, we extend the proposed function to a more generalized form called PFPLUS, with two parameters that can be fixed or learnable, so as to augment its expressive capacity, and the outcomes of identical tests validate this improvement.
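
Since the abstract does not state the exact formulation, the short PyTorch sketch below is only a hypothetical illustration of the idea it describes: a sign-symmetric, power-based activation whose two parameters can be kept fixed or made learnable, in the spirit of PFPLUS. The class name SignedPowerActivation and the expression sign(x) * k * ((|x| + 1)^p - 1) are assumptions for illustration, not the authors' definition.

import torch
import torch.nn as nn


class SignedPowerActivation(nn.Module):
    """Hypothetical sign-symmetric, power-based activation with two
    parameters (k, p) that can be fixed or learnable; the exact
    FPLUS/PFPLUS formulation is given in the paper, not here."""

    def __init__(self, k: float = 1.0, p: float = 0.5, learnable: bool = True):
        super().__init__()
        k_t = torch.tensor(float(k))
        p_t = torch.tensor(float(p))
        if learnable:
            # Learnable variant: both parameters are trained with the network.
            self.k = nn.Parameter(k_t)
            self.p = nn.Parameter(p_t)
        else:
            # Fixed variant: parameters are stored as buffers, not trained.
            self.register_buffer("k", k_t)
            self.register_buffer("p", p_t)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Apply a power function to the magnitude and restore the sign,
        # so the output is odd-symmetric around the origin.
        return torch.sign(x) * self.k * ((x.abs() + 1.0).pow(self.p) - 1.0)


if __name__ == "__main__":
    act = SignedPowerActivation(learnable=True)
    print(act(torch.linspace(-3, 3, 7)))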

