Flexible Rectified Linear Units for Improving Convolutional Neural Networks

06/25/2017
by   Suo Qiu, et al.

The rectified linear unit (ReLU) is a widely used activation function for deep convolutional neural networks. In this paper, we propose a novel activation function called the flexible rectified linear unit (FReLU). FReLU improves the flexibility of ReLU with a learnable rectified point, achieving faster convergence and higher performance. Because this point is learned by self-adaptation, FReLU does not rely on strict assumptions; it is also simple and effective, requiring no exponential function. We evaluate FReLU on two standard image classification datasets, CIFAR-10 and CIFAR-100, and the experimental results demonstrate the strengths of the proposed method.
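To make the idea concrete, here is a minimal NumPy sketch of an activation with a learnable rectified point, assuming the form frelu(x) = max(x, 0) + b described in the paper; the scalar `b` stands in for the per-layer parameter that would normally be updated by backpropagation:

```python
import numpy as np

def frelu(x, b):
    """Flexible ReLU sketch: shift the rectified point by b.

    b is a plain scalar here as a stand-in for the learnable
    parameter; a real layer would train one b per channel/layer.
    """
    return np.maximum(x, 0.0) + b

# Negative inputs saturate at b instead of 0; positive inputs are shifted by b.
x = np.array([-2.0, -0.5, 0.0, 1.5])
print(frelu(x, b=-0.4))
```

Unlike ELU-style activations, no exponential is evaluated, so the forward and backward passes stay as cheap as plain ReLU.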


Related research

08/10/2019
Natural-Logarithm-Rectified Activation Function in Convolutional Neural Networks
Activation functions play a key role in providing remarkable performance...

05/05/2015
Empirical Evaluation of Rectified Activations in Convolutional Network
In this paper we investigate the performance of different types of recti...

05/11/2015
Improving neural networks with bunches of neurons modeled by Kumaraswamy units: Preliminary study
Deep neural networks have recently achieved state-of-the-art results in ...

06/01/2016
Improving Deep Neural Network with Multiple Parametric Exponential Linear Units
Activation function is crucial to the recent successes of deep neural ne...

12/09/2022
AP: Selective Activation for De-sparsifying Pruned Neural Networks
The rectified linear unit (ReLU) is a highly successful activation funct...

10/23/2021
Parametric Variational Linear Units (PVLUs) in Deep Convolutional Networks
The Rectified Linear Unit is currently a state-of-the-art activation fun...

05/22/2018
ARiA: Utilizing Richard's Curve for Controlling the Non-monotonicity of the Activation Function in Deep Neural Nets
This work introduces a novel activation unit that can be efficiently emp...
