ErfAct and PSerf: Non-monotonic smooth trainable Activation Functions

09/09/2021
by Koushik Biswas, et al.

An activation function is a crucial component of a neural network because it introduces non-linearity into the network. The state-of-the-art performance of a neural network also depends on the choice of activation function. We propose two novel non-monotonic, smooth, trainable activation functions, called ErfAct and PSerf. Experiments suggest that the proposed functions significantly improve network performance compared to widely used activations such as ReLU, Swish, and Mish. Replacing ReLU with ErfAct and PSerf, we obtain 5.21% and 5.04% improvements in top-1 accuracy on the CIFAR100 dataset, a 2.58% improvement in top-1 accuracy on the CIFAR10 dataset, and a 1.0% improvement in mean average precision (mAP) with the SSD300 model on the Pascal VOC dataset.
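
The abstract does not reproduce the functional forms of ErfAct and PSerf, so the PyTorch sketch below only illustrates how a trainable, smooth, non-monotonic activation can be dropped into a network in place of ReLU. The specific forms ErfAct(x) = x * erf(alpha * exp(beta * x)) and PSerf(x) = x * erf(gamma * softplus(delta * x)), the parameter names alpha, beta, gamma, delta, and their initial values are assumptions made for illustration, not definitions taken from the paper.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class ErfAct(nn.Module):
        # Illustrative trainable activation: f(x) = x * erf(alpha * exp(beta * x)).
        # alpha and beta are learnable scalars (assumed form and initial values).
        def __init__(self, alpha: float = 0.75, beta: float = 0.75):
            super().__init__()
            self.alpha = nn.Parameter(torch.tensor(alpha))
            self.beta = nn.Parameter(torch.tensor(beta))

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return x * torch.erf(self.alpha * torch.exp(self.beta * x))

    class PSerf(nn.Module):
        # Illustrative parametric Serf-style activation:
        # f(x) = x * erf(gamma * softplus(delta * x)), with gamma and delta learnable.
        def __init__(self, gamma: float = 1.0, delta: float = 1.0):
            super().__init__()
            self.gamma = nn.Parameter(torch.tensor(gamma))
            self.delta = nn.Parameter(torch.tensor(delta))

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            return x * torch.erf(self.gamma * F.softplus(self.delta * x))

    # Drop-in replacement for ReLU in an existing block:
    mlp = nn.Sequential(nn.Linear(128, 128), ErfAct(), nn.Linear(128, 10))

Because the shape parameters are registered as nn.Parameter objects, they are updated by backpropagation together with the network weights, which is what makes such activations "trainable".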

Related research

11/08/2021 · SMU: smooth activation function for deep networks using smoothing maximum technique
Deep learning researchers have a keen interest in proposing two new nove...

06/17/2021 · Orthogonal-Padé Activation Functions: Trainable Activation functions for smooth and faster convergence in deep networks
We have proposed orthogonal-Padé activation functions, which are trainab...

03/01/2020 · Soft-Root-Sign Activation Function
The choice of activation function in deep networks has a significant eff...

05/24/2022 · Constrained Monotonic Neural Networks
Deep neural networks are becoming increasingly popular in approximating ...

01/01/2016 · Stochastic Neural Networks with Monotonic Activation Functions
We propose a Laplace approximation that creates a stochastic unit from a...

08/16/2019 · Effect of Activation Functions on the Training of Overparametrized Neural Nets
It is well-known that overparametrized neural networks trained using gra...

11/02/2018 · Efficient Neural Network Robustness Certification with General Activation Functions
Finding minimum distortion of adversarial examples and thus certifying r...
