EIS – a family of activation functions combining Exponential, ISRU, and Softplus

09/28/2020
by Koushik Biswas, et al.

Activation functions play a pivotal role in function learning with neural networks. The non-linearity in the learned function is achieved by repeated application of the activation function. Over the years, numerous activation functions have been proposed to improve accuracy on a variety of tasks. Basic functions like ReLU, Sigmoid, Tanh, or Softplus have been favourites in the deep learning community because of their simplicity. In recent years, several novel activation functions derived from these basic functions have been proposed and have improved accuracy on some challenging datasets. We propose a five-hyper-parameter family of activation functions, namely EIS, defined as x(ln(1+e^x))^α / (√(β+γx^2) + δe^(-θx)). We show examples of activation functions from the EIS family that outperform widely used activation functions on some well-known datasets and models. For example, x·ln(1+e^x)/(x+1.16e^(-x)) beats ReLU by 0.89% with DenseNet-169 and 0.24% with Inception V3 on the CIFAR100 dataset, and by 1.13% with Inception V3, 0.13% with DenseNet-169, and 0.94% with SimpleNet on the CIFAR10 dataset. Similarly, x·ln(1+e^x)/√(1+x^2) beats ReLU by 1.68% with DenseNet-169 and 0.30% with Inception V3 on CIFAR100, and by 1.0% with Inception V3, 0.15% with DenseNet-169, and 1.13% with SimpleNet on CIFAR10.
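For concreteness, here is a minimal PyTorch sketch of an EIS-family activation (not code from the paper); the class name EIS and the hyper-parameter defaults are illustrative, chosen to reproduce the second example above, x·ln(1+e^x)/√(1+x^2).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class EIS(nn.Module):
    """Sketch of an EIS-family activation:

        f(x) = x * softplus(x)**alpha / (sqrt(beta + gamma * x**2) + delta * exp(-theta * x))

    The five hyper-parameters (alpha, beta, gamma, delta, theta) select a member
    of the family. The defaults below correspond to the second example in the
    abstract, x * ln(1 + e^x) / sqrt(1 + x^2).
    """

    def __init__(self, alpha=1.0, beta=1.0, gamma=1.0, delta=0.0, theta=1.0):
        super().__init__()
        self.alpha, self.beta, self.gamma = alpha, beta, gamma
        self.delta, self.theta = delta, theta

    def forward(self, x):
        # numerator: x * (ln(1 + e^x))^alpha, using the numerically stable softplus
        num = x * F.softplus(x) ** self.alpha
        # denominator: sqrt(beta + gamma * x^2) + delta * e^(-theta * x)
        den = torch.sqrt(self.beta + self.gamma * x ** 2) + self.delta * torch.exp(-self.theta * x)
        return num / den


# Example: use it as a drop-in replacement for ReLU in a small network.
model = nn.Sequential(nn.Linear(32, 64), EIS(), nn.Linear(64, 10))
out = model(torch.randn(8, 32))
```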


