Deep Learning using Rectified Linear Units (ReLU)

03/22/2018
by Abien Fred Agarap

We introduce the use of rectified linear units (ReLU) as the classification function in a deep neural network (DNN). Conventionally, ReLU is used as an activation function in DNNs, with the Softmax function as their classification function. However, there have been several studies on using classification functions other than Softmax, and this study is an addition to those. We accomplish this by taking the activation of the penultimate layer h_{n-1} in a neural network, then multiplying it by the weight parameters θ to get the raw scores o_i. Afterwards, we threshold the raw scores o_i at 0, i.e. f(o) = max(0, o_i), where f(o) is the ReLU function. We provide class predictions ŷ through the argmax function, i.e. argmax f(x).
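As a rough illustration of the output layer described above, here is a minimal NumPy sketch (not the paper's implementation; the names h_prev, theta, and relu_classifier are illustrative): the penultimate activation h_{n-1} is multiplied by the weight parameters θ to get raw scores o_i, the scores are thresholded at 0 by ReLU, and argmax yields the class predictions ŷ.

```python
import numpy as np

def relu_classifier(h_prev, theta):
    # h_prev: activation of the penultimate layer h_{n-1}, shape (batch, d)
    # theta:  weight parameters of the final layer, shape (d, num_classes)
    o = h_prev @ theta               # raw scores o_i
    f_o = np.maximum(0.0, o)         # ReLU thresholds the raw scores at 0: f(o) = max(0, o_i)
    y_hat = np.argmax(f_o, axis=1)   # class predictions ŷ via argmax
    return y_hat

# Example usage with random values (illustrative only)
rng = np.random.default_rng(0)
h_prev = rng.normal(size=(4, 8))
theta = rng.normal(size=(8, 3))
print(relu_classifier(h_prev, theta))  # array of predicted class indices, one per example
```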

Related research

02/10/2020  On Approximation Capabilities of ReLU Activation and Softmax Output Layer in Neural Networks
In this paper, we have extended the well-established universal approxima...

03/22/2020  Dynamic ReLU
Rectified linear units (ReLU) are commonly used in deep neural networks....

12/28/2021  Reduced Softmax Unit for Deep Neural Network Accelerators
The Softmax activation layer is a very popular Deep Neural Network (DNN)...

05/22/2018  ARiA: Utilizing Richard's Curve for Controlling the Non-monotonicity of the Activation Function in Deep Neural Nets
This work introduces a novel activation unit that can be efficiently emp...

04/06/2018  A comparison of deep networks with ReLU activation function and linear spline-type methods
Deep neural networks (DNNs) generate much richer function spaces than sh...

12/07/2020  Statistical Mechanics of Deep Linear Neural Networks: The Back-Propagating Renormalization Group
The success of deep learning in many real-world tasks has triggered an e...

10/13/2021  Clustering-Based Interpretation of Deep ReLU Network
Amongst others, the adoption of Rectified Linear Units (ReLUs) is regard...
