TaLU: A Hybrid Activation Function Combining Tanh and Rectified Linear Unit to Enhance Neural Networks

05/08/2023
by Md. Mehedi Hasan, et al.

Deep learning models are widely applied to classification tasks, where accurate detection of target objects is essential. Their accuracy, however, depends in part on the activation functions used in the hidden and output layers. This paper introduces TaLU, an activation function that combines Tanh and the Rectified Linear Unit (ReLU) to improve prediction accuracy. ReLU is favoured by many deep learning researchers for its computational efficiency, ease of implementation, and intuitive behaviour, but it suffers from the dying-gradient problem: for negative inputs its output and gradient are both zero, so the affected weights stop updating. A number of researchers have proposed alternatives to address this issue, most notably Leaky ReLU, Softplus, Softsign, ELU, and Thresholded ReLU. This research develops TaLU, a modified activation function combining Tanh and ReLU, which mitigates the dying-gradient problem of ReLU. A deep learning model using the proposed activation function was tested on MNIST and CIFAR-10, and it outperforms ReLU and several of the other activation functions studied in terms of accuracy (by 0% to 6% in most cases, when used with Batch Normalization and a reasonable learning rate).
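The abstract does not give TaLU's exact functional form, but a natural way to combine the two components is to keep the identity on the positive side (as in ReLU) and use tanh on the negative side so the gradient stays nonzero there. The sketch below illustrates that idea in NumPy; the threshold `alpha` and the clamping below it are assumptions for illustration, not the paper's definition.

```python
import numpy as np

def talu(x, alpha=-0.75):
    """Illustrative TaLU-style activation (assumed form, not the paper's exact definition).

    - x > 0           : identity, as in ReLU
    - alpha <= x <= 0 : tanh(x), so negative inputs keep a nonzero gradient
    - x < alpha       : clamped to tanh(alpha) (assumed lower threshold)
    """
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, x, np.where(x >= alpha, np.tanh(x), np.tanh(alpha)))

def talu_grad(x, alpha=-0.75):
    """Gradient of the sketch above: 1 for x > 0, 1 - tanh(x)^2 on [alpha, 0], 0 below alpha."""
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, 1.0, np.where(x >= alpha, 1.0 - np.tanh(x) ** 2, 0.0))

if __name__ == "__main__":
    z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
    print(talu(z))       # negative inputs map to tanh values instead of 0
    print(talu_grad(z))  # gradient is nonzero on (alpha, 0], unlike ReLU
```

Whatever the exact thresholding, the point the abstract makes is that the gradient is no longer identically zero for negative inputs, which is what mitigates the dying-ReLU behaviour while keeping ReLU's cheap identity path for positive activations.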


