A ReLU Dense Layer to Improve the Performance of Neural Networks

10/22/2020
by Alireza M. Javid, et al.

We propose ReDense, a simple, low-complexity way to improve the performance of trained neural networks. We use a combination of random weights and the rectified linear unit (ReLU) activation function to add a ReLU dense (ReDense) layer to a trained neural network so that it achieves a lower training loss. The lossless flow property (LFP) of ReLU is the key to achieving this lower training loss while keeping the generalization error small. Because of its shallow structure, ReDense does not suffer from the vanishing gradient problem during training. We show experimentally that ReDense improves the training and testing performance of various neural network architectures across different optimization losses and activation functions. Finally, we test ReDense on some state-of-the-art architectures and show the performance improvement on benchmark datasets.
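To make the construction concrete, below is a minimal PyTorch sketch of a ReDense-style add-on. It is not the authors' reference implementation: the [W; -W] weight stacking is one standard way to obtain the lossless flow property of ReLU, and details such as the layer width, weight scaling, and which parameters are retrained are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ReDense(nn.Module):
    """Adds a random-weight ReLU dense layer plus a new trainable output
    layer on top of a trained network (sketch; details are assumptions)."""
    def __init__(self, trained_model, out_dim, num_classes, width=128):
        super().__init__()
        assert width % 2 == 0, "width must be even for the [W; -W] stacking"
        self.backbone = trained_model
        for p in self.backbone.parameters():
            p.requires_grad = False              # trained network stays fixed
        W = torch.randn(width // 2, out_dim) / out_dim ** 0.5
        # Stacking W and -W yields the lossless flow property:
        # relu(W z) - relu(-W z) = W z, so z is recoverable and the
        # ReLU destroys no information.
        self.V = nn.Parameter(torch.cat([W, -W], dim=0), requires_grad=False)
        self.output = nn.Linear(width, num_classes)  # only part that is trained

    def forward(self, x):
        z = self.backbone(x)                     # trained network's output
        h = F.relu(F.linear(z, self.V))          # ReLU dense layer, random weights
        return self.output(h)

# Hypothetical usage: retrain only the new output layer.
base = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 10))  # stands in for a trained net
model = ReDense(base, out_dim=10, num_classes=10, width=128)
optimizer = torch.optim.Adam(model.output.parameters(), lr=1e-3)
```

Because only the small output layer is optimized and the added structure is shallow, training the ReDense layer is cheap and gradients never propagate through a deep stack, which is consistent with the abstract's point about avoiding the vanishing gradient problem.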


Related research

08/10/2019 · Natural-Logarithm-Rectified Activation Function in Convolutional Neural Networks
Activation functions play a key role in providing remarkable performance...

10/29/2020 · Over-parametrized neural networks as under-determined linear systems
We draw connections between simple neural networks and under-determined ...

09/13/2019 · Shapley Interpretation and Activation in Neural Networks
We propose a novel Shapley value approach to help address neural network...

02/23/2023 · Testing Stationarity Concepts for ReLU Networks: Hardness, Regularity, and Robust Algorithms
We study the computational problem of the stationarity test for the empi...

12/09/2022 · AP: Selective Activation for De-sparsifying Pruned Neural Networks
The rectified linear unit (ReLU) is a highly successful activation funct...

05/24/2019 · Greedy Shallow Networks: A New Approach for Constructing and Training Neural Networks
We present a novel greedy approach to obtain a single layer neural netwo...
