Empirical Evaluation of Rectified Activations in Convolutional Network

05/05/2015
by Bing Xu, et al.

In this paper we investigate the performance of different types of rectified activation functions in convolutional neural networks: the standard rectified linear unit (ReLU), the leaky rectified linear unit (Leaky ReLU), the parametric rectified linear unit (PReLU), and a new randomized leaky rectified linear unit (RReLU). We evaluate these activation functions on a standard image classification task. Our experiments suggest that incorporating a non-zero slope for the negative part of rectified activation units could consistently improve the results. Our findings thus cast doubt on the common belief that sparsity is the key to ReLU's good performance. Moreover, on small-scale datasets, both using a deterministic negative slope and learning it are prone to overfitting; they are not as effective as their randomized counterpart. Using RReLU, we achieve 75.68% accuracy on the CIFAR-100 test set without multiple tests or ensembling.
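Below is a minimal NumPy sketch of the four activations compared in the abstract, applied elementwise. The slope values are illustrative defaults, not taken from the paper: Leaky ReLU uses the common 0.01 slope, and RReLU samples the negative slope directly from U(1/8, 1/3) (the convention used by, e.g., torch.nn.RReLU), whereas the paper samples the reciprocal of the slope from U(3, 8) and averages it at test time.

import numpy as np

def relu(x):
    # Standard ReLU: zero out the negative part.
    return np.maximum(x, 0.0)

def leaky_relu(x, slope=0.01):
    # Leaky ReLU: small fixed slope on the negative part (0.01 is a common default).
    return np.where(x >= 0, x, slope * x)

def prelu(x, a):
    # PReLU: the negative slope `a` is learned during training
    # (typically one value per channel); here it is passed in explicitly.
    return np.where(x >= 0, x, a * x)

def rrelu(x, lower=1.0 / 8, upper=1.0 / 3, training=True, rng=None):
    # RReLU: during training the negative slope is sampled uniformly per element;
    # at test time a fixed average slope is used instead (illustrative convention,
    # slightly different from the paper's averaging of the sampled reciprocal).
    x = np.asarray(x, dtype=float)
    if training:
        rng = np.random.default_rng() if rng is None else rng
        a = rng.uniform(lower, upper, size=x.shape)
    else:
        a = (lower + upper) / 2.0
    return np.where(x >= 0, x, a * x)

The randomized slope acts as a form of regularization, which is consistent with the abstract's observation that fixed or learned negative slopes are more prone to overfitting on small datasets.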


Related research

06/25/2017  Flexible Rectified Linear Units for Improving Convolutional Neural Networks
Rectified linear unit (ReLU) is a widely used activation function for de...

08/23/2019  Mish: A Self Regularized Non-Monotonic Neural Activation Function
The concept of non-linearity in a Neural Network is introduced by an act...

10/16/2017  Searching for Activation Functions
The choice of activation functions in deep networks has a significant ef...

11/24/2020  Comparisons among different stochastic selection of activation layers for convolutional neural networks for healthcare
Classification of biological images is an important task with crucial ap...

07/26/2018  Effectiveness of Scaled Exponentially-Regularized Linear Units (SERLUs)
Recently, self-normalizing neural networks (SNNs) have been proposed wit...

08/22/2018  An Attention-Gated Convolutional Neural Network for Sentence Classification
The classification task of sentences is very challenging because of the ...

07/28/2022  PEA: Improving the Performance of ReLU Networks for Free by Using Progressive Ensemble Activations
In recent years novel activation functions have been proposed to improve...
