Natural-Logarithm-Rectified Activation Function in Convolutional Neural Networks

08/10/2019
by   Yang Liu, et al.

Activation functions play a key role in providing remarkable performance in deep neural networks, and the rectified linear unit (ReLU) is one of the most widely used activation functions. Various new activation functions and improvements on ReLU have been proposed, but each carries performance drawbacks. In this paper, we propose an improved activation function, which we name the natural-logarithm-rectified linear unit (NLReLU). This activation function uses the parametric natural logarithmic transform to improve ReLU and is simply defined as f(x) = ln(β·max(0, x) + 1.0). NLReLU not only retains the sparse activation characteristic of ReLU, but it also alleviates the "dying ReLU" and vanishing gradient problems to some extent. It also reduces the bias shift effect and heteroscedasticity of neuron data distributions among network layers in order to accelerate the learning process. The proposed method was verified across ten convolutional neural networks of different depths on two benchmark datasets. Experiments illustrate that convolutional neural networks with NLReLU exhibit higher accuracy than those with ReLU, and that NLReLU is comparable to other well-known activation functions. NLReLU provides 0.16% higher classification accuracy on average than ReLU when used in shallow convolutional neural networks on the MNIST and CIFAR-10 datasets. The average accuracy of deep convolutional neural networks with NLReLU is 1.35% higher than with ReLU.

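As a minimal sketch (not the authors' reference implementation), the NLReLU transform described in the abstract can be written in NumPy as below; the default β = 1.0 is an illustrative assumption, not a value taken from the paper.

```python
import numpy as np

def nlrelu(x, beta=1.0):
    """Natural-logarithm-rectified linear unit (NLReLU).

    Applies a parametric natural-log transform to the ReLU output:
        f(x) = ln(beta * max(0, x) + 1.0)
    Negative inputs map to 0 (preserving ReLU's sparse activation),
    while large positive inputs are compressed logarithmically.
    beta=1.0 is an illustrative default.
    """
    return np.log(beta * np.maximum(0.0, x) + 1.0)

# Example: compare ReLU and NLReLU on a few pre-activation values.
x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0, 10.0])
print("ReLU:  ", np.maximum(0.0, x))
print("NLReLU:", nlrelu(x))
```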
