An Analysis of State-of-the-art Activation Functions For Supervised Deep Neural Network

04/05/2021
by Anh Nguyen, et al.

This paper provides an analysis of state-of-the-art activation functions with respect to supervised classification with deep neural networks. The activation functions considered are the Rectified Linear Unit (ReLU), Exponential Linear Unit (ELU), Scaled Exponential Linear Unit (SELU), Gaussian Error Linear Unit (GELU), and Inverse Square Root Linear Unit (ISRLU). For evaluation, experiments are conducted over two deep learning architectures that integrate these activation functions. The first model, based on a Multilayer Perceptron (MLP), is evaluated on the MNIST dataset. The second, a VGGish-based architecture, is applied to Acoustic Scene Classification (ASC) Task 1A of the DCASE 2018 challenge, testing whether these activation functions perform well across different datasets and network architectures.
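The five activation functions under study have simple closed forms. The following is a minimal NumPy sketch of each, written for illustration rather than taken from the paper; the alpha values for ELU and ISRLU are the common defaults of 1.0 (the paper may tune them differently), and the SELU constants are the fixed values from Klambauer et al. (2017).

import numpy as np

def relu(x):
    # ReLU: pass positive inputs through, zero out the rest
    return np.maximum(0.0, x)

def elu(x, alpha=1.0):
    # ELU: alpha * (exp(x) - 1) for x <= 0, identity for x > 0
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def selu(x):
    # SELU: ELU scaled by fixed constants chosen for self-normalization
    lam, alpha = 1.0507009873554805, 1.6732632423543772
    return lam * np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def gelu(x):
    # GELU, tanh approximation: 0.5 * x * (1 + tanh(sqrt(2/pi) * (x + 0.044715 * x^3)))
    return 0.5 * x * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (x + 0.044715 * x**3)))

def isrlu(x, alpha=1.0):
    # ISRLU: x / sqrt(1 + alpha * x^2) for x < 0, identity for x >= 0
    return np.where(x < 0, x / np.sqrt(1.0 + alpha * x * x), x)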
