Parametric Rectified Power Sigmoid Units: Learning Nonlinear Neural Transfer Analytical Forms

The paper proposes representation functionals in a dual paradigm where learning jointly concerns both the linear convolutional weights and the parametric forms of the nonlinear activation functions. The nonlinear forms proposed for the functional representation belong to a new class of parametric neural transfer functions called rectified power sigmoid units. This class is constructed to combine the advantages of sigmoid and rectified linear unit (ReLU) functions while avoiding their respective drawbacks. Moreover, the analytic form of this class involves scale, shift, and shape parameters, yielding a wide range of activation shapes that includes the standard ReLU as a limit case. The parameters of this transfer class are treated as learnable so that complex activation shapes useful for solving machine learning problems can be discovered. Jointly learning the convolutional and rectified power sigmoid parameters achieves outstanding performance in both shallow and deep learning frameworks. This class opens new prospects for machine learning in the sense that learnable parameters are attached not only to linear transformations but also to suitable nonlinear operators.
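To make the idea concrete, below is a minimal PyTorch sketch of a learnable activation with scale, shift, and shape parameters optimized jointly with the convolutional weights. The module name `RectifiedPowerSigmoid` and the specific gating form (a sigmoid-gated rectification that sharpens toward ReLU as the shape parameter grows) are illustrative assumptions, not the paper's exact analytic expression.

```python
import torch
import torch.nn as nn

class RectifiedPowerSigmoid(nn.Module):
    """Illustrative learnable activation with scale, shift, and shape
    parameters. NOTE: this gating form is a hypothetical stand-in chosen
    for illustration; the paper derives its own analytic expression.
    """
    def __init__(self, scale=1.0, shift=0.0, shape=1.0):
        super().__init__()
        # All three parameters are registered as nn.Parameter so they are
        # learned jointly with the linear (convolutional) weights.
        self.scale = nn.Parameter(torch.tensor(float(scale)))
        self.shift = nn.Parameter(torch.tensor(float(shift)))
        self.shape = nn.Parameter(torch.tensor(float(shape)))

    def forward(self, x):
        # Sigmoid gate around a learnable shift; a large `shape` value
        # sharpens the gate, so the activation tends toward a hard ReLU
        # cutoff in the limit (with shift = 0 and scale = 1).
        gate = torch.sigmoid(self.shape * (x - self.shift))
        return self.scale * x * gate
```

Because the parameters are registered through `nn.Parameter`, any optimizer built over `model.parameters()` updates them alongside the convolutional kernels, which is the joint learning scheme the abstract describes.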
