ALReLU: A different approach on Leaky ReLU activation function to improve Neural Networks Performance

Despite the unresolved 'dying ReLU' problem, the classical ReLU activation function (AF) has been extensively applied in Deep Neural Networks (DNNs), in particular Convolutional Neural Networks (CNNs), for image classification. ReLU's well-known gradient issues pose challenges for applications in both academia and industry. Recent attempts at improvement follow a similar direction, merely proposing variations of the AF, such as Leaky ReLU (LReLU), and therefore remain subject to the same unresolved gradient problems. In this paper, the Absolute Leaky ReLU (ALReLU) AF, a variation of LReLU, is proposed as an alternative method to resolve the common 'dying ReLU' problem in NN-based algorithms for supervised learning. The experimental results demonstrate that using the absolute value of LReLU's small negative-slope output yields a significant improvement over LReLU and ReLU on image classification of diseases such as COVID-19, as well as on text and tabular data classification tasks, across five different datasets.
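
The abstract describes ALReLU as taking the absolute value of LReLU's small negative-slope output. Below is a minimal NumPy sketch of that idea; the slope alpha = 0.01 (the common LReLU default) and the function names are illustrative assumptions, not code from the paper.

import numpy as np

def leaky_relu(x, alpha=0.01):
    # Standard Leaky ReLU: x for x >= 0, alpha * x for x < 0 (alpha=0.01 assumed).
    return np.where(x >= 0, x, alpha * x)

def alrelu(x, alpha=0.01):
    # ALReLU sketch: the negative branch of Leaky ReLU made positive,
    # i.e. alpha * |x| for x < 0, and x unchanged for x >= 0.
    return np.where(x >= 0, x, alpha * np.abs(x))

x = np.array([-3.0, -0.5, 0.0, 0.5, 3.0])
print("LReLU :", leaky_relu(x))   # negative inputs map to small negative values
print("ALReLU:", alrelu(x))       # negative inputs map to small positive values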

