ARiA: Utilizing Richard's Curve for Controlling the Non-monotonicity of the Activation Function in Deep Neural Nets

05/22/2018
by Narendra Patwardhan, et al.

This work introduces a novel activation unit that can be efficiently employed in deep neural nets (DNNs) and performs significantly better than the traditional Rectified Linear Unit (ReLU). The function is a two-parameter version of the specialized Richard's Curve, which we call the Adaptive Richard's Curve weighted Activation (ARiA). This function is non-monotonic, analogous to the recently introduced Swish, but allows precise control over its non-monotonic convexity by varying the hyperparameters. We first demonstrate the mathematical significance of the two-parameter ARiA and then apply it to benchmark problems such as MNIST, CIFAR-10, and CIFAR-100, comparing its performance with ReLU and Swish units. Our results show significantly superior performance on all these datasets, making ARiA a potential replacement for ReLU and other activations in DNNs.
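As a concrete illustration (not the authors' reference implementation), below is a minimal NumPy sketch of a two-parameter activation of this kind. It assumes the form ARiA2(x) = x * (1 + exp(-beta * x))^(-alpha), a special case of Richards' generalized logistic curve; the default alpha and beta values here are illustrative, not the paper's tuned settings. Note that at alpha = beta = 1 the expression reduces to Swish-1, i.e. x * sigmoid(x).

import numpy as np

def aria2(x, alpha=1.5, beta=2.0):
    """Sketch of a two-parameter ARiA-style activation.

    Assumes the form x * (1 + exp(-beta * x))**(-alpha), a special case
    of Richards' generalized logistic curve. alpha shapes the
    non-monotonic negative lobe; beta controls the transition rate.
    The defaults are illustrative, not the paper's tuned values.
    """
    return x * np.power(1.0 + np.exp(-beta * x), -alpha)

if __name__ == "__main__":
    x = np.linspace(-5.0, 5.0, 11)
    print(aria2(x))                        # assumed ARiA2 form
    print(aria2(x, alpha=1.0, beta=1.0))   # reduces to Swish-1: x * sigmoid(x)

Because the whole function is elementwise and differentiable, a sketch like this can be dropped into any framework's autograd as a drop-in replacement for ReLU, with alpha and beta treated either as fixed hyperparameters or as learnable scalars.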


Related research:

03/22/2018 · Deep Learning using Rectified Linear Units (ReLU)
We introduce the use of rectified linear units (ReLU) as the classificat...

06/25/2017 · Flexible Rectified Linear Units for Improving Convolutional Neural Networks
Rectified linear unit (ReLU) is a widely used activation function for de...

03/22/2020 · Dynamic ReLU
Rectified linear units (ReLU) are commonly used in deep neural networks....

11/17/2017 · xUnit: Learning a Spatial Activation Function for Efficient Image Restoration
In recent years, deep neural networks (DNNs) achieved unprecedented perf...

10/27/2018 · A Methodology for Automatic Selection of Activation Functions to Design Hybrid Deep Neural Networks
Activation functions influence behavior and performance of DNNs. Nonline...

10/23/2021 · Parametric Variational Linear Units (PVLUs) in Deep Convolutional Networks
The Rectified Linear Unit is currently a state-of-the-art activation fun...

05/20/2020 · ReLU Code Space: A Basis for Rating Network Quality Besides Accuracy
We propose a new metric space of ReLU activation codes equipped with a t...