APTx: better activation function than MISH, SWISH, and ReLU's variants used in deep learning

09/10/2022
by Ravin Kumar, et al.

Activation functions introduce non-linearity into deep neural networks. This non-linearity helps neural networks learn faster and more efficiently from the dataset. In deep learning, many activation functions have been developed and are chosen based on the type of problem. ReLU's variants, SWISH, and MISH are the go-to activation functions; MISH is considered to perform similarly to or even better than SWISH, and much better than ReLU. In this paper, we propose an activation function named APTx that behaves similarly to MISH but requires fewer mathematical operations to compute. The lower computational cost of APTx speeds up model training and thus also reduces the hardware requirements for the deep learning model.
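The abstract does not spell out the APTx formula, so the snippet below is only an illustrative sketch: it compares MISH, x · tanh(softplus(x)), against an APTx-style activation of the assumed form (alpha + tanh(beta·x)) · gamma·x, with assumed defaults alpha = 1, beta = 1, gamma = 0.5. That form needs only a single tanh and a few multiplications, whereas MISH also requires a softplus (exp and log), which is the kind of saving the abstract refers to.

    # Illustrative sketch only: the abstract above does not state the APTx formula.
    # Assumed form and defaults (alpha=1, beta=1, gamma=0.5) are this sketch's assumptions.
    import numpy as np

    def mish(x):
        # MISH(x) = x * tanh(softplus(x)), softplus(x) = ln(1 + exp(x))
        return x * np.tanh(np.log1p(np.exp(x)))

    def aptx(x, alpha=1.0, beta=1.0, gamma=0.5):
        # APTx-style: (alpha + tanh(beta * x)) * gamma * x
        # One tanh plus a few multiply/adds; no softplus, exp, or log.
        return (alpha + np.tanh(beta * x)) * gamma * x

    if __name__ == "__main__":
        x = np.linspace(-4.0, 4.0, 9)
        print("mish :", np.round(mish(x), 3))
        print("aptx :", np.round(aptx(x), 3))

Both functions vanish at the origin and grow roughly linearly for large positive inputs, so the surrogate tracks MISH's overall shape while staying cheaper per element.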


