On Approximation Capabilities of ReLU Activation and Softmax Output Layer in Neural Networks

02/10/2020
by   Behnam Asadi, et al.

In this paper, we extend the well-established universal approximation theory to neural networks that use the unbounded ReLU activation function and a nonlinear softmax output layer. We prove that a sufficiently large neural network with ReLU activations can approximate any function in L^1 to arbitrary precision. Moreover, we show that a sufficiently large neural network with a nonlinear softmax output layer can approximate any indicator function in L^1, which corresponds to the mutually exclusive class labels in any realistic multi-class pattern classification problem. To the best of our knowledge, this work is the first theoretical justification for using softmax output layers in neural networks for pattern classification.
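
The two results can be paraphrased roughly as follows (standard L^1 notation; this is a sketch of the claims as described in the abstract, not the paper's exact theorem statements):

(1) ReLU networks are universal in L^1: for every target function f in L^1 and every ε > 0, there exists a sufficiently large neural network g with ReLU activations such that ||f - g||_{L^1} < ε.

(2) Softmax output layers preserve this property for classification targets: for every indicator function 1_A in L^1 (i.e., mutually exclusive class labels) and every ε > 0, there exists a sufficiently large ReLU network with a nonlinear softmax output layer whose output h satisfies ||1_A - h||_{L^1} < ε.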


