Review and Comparison of Commonly Used Activation Functions for Deep Neural Networks

10/15/2020
by   Tomasz Szandała, et al.

Activation functions are the primary decision-making units of neural networks. They evaluate the output of each neural node and are therefore essential to the performance of the whole network; choosing the most appropriate activation function is thus a critical step in neural network design. Acharya et al. (2018) note that numerous formulations have been proposed over the years, though some are now considered deprecated because they fail to operate properly under certain conditions. These functions exhibit a variety of characteristics deemed essential for successful learning, such as their monotonicity, their derivatives, and whether their range is finite (Bach 2017). This paper reviews the commonly used activation functions, such as Swish, ReLU, and Sigmoid, covering their properties, their respective pros and cons, and recommendations on when to apply each formula.
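To make the comparison concrete, here is a minimal sketch of three of the activation functions the abstract names. The formulas are the standard textbook definitions (not code taken from the paper itself), implemented with NumPy:

```python
import numpy as np

def sigmoid(x):
    """Sigmoid: bounded in (0, 1), monotonic, smooth everywhere."""
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    """ReLU: zero for negative inputs, identity for positive ones;
    unbounded above and non-differentiable at 0."""
    return np.maximum(0.0, x)

def swish(x, beta=1.0):
    """Swish: x * sigmoid(beta * x); smooth but non-monotonic,
    which distinguishes it from Sigmoid and ReLU."""
    return x * sigmoid(beta * x)

x = np.array([-2.0, 0.0, 2.0])
print(relu(x))       # [0. 0. 2.]
print(sigmoid(0.0))  # 0.5
print(swish(0.0))    # 0.0
```

The differing ranges and monotonicity visible here are exactly the characteristics the paper weighs when recommending one function over another.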

