Activation Functions in Artificial Neural Networks: A Systematic Overview

01/25/2021
by Johannes Lederer, et al.

Activation functions shape the outputs of artificial neurons and are therefore integral parts of neural networks in general and deep learning in particular. Some activation functions, such as the logistic sigmoid and ReLU, have been used for decades. But as deep learning has become a mainstream research topic, new activation functions have mushroomed, leading to confusion in both theory and practice. This paper provides an analytic yet up-to-date overview of popular activation functions and their properties, making it a timely resource for anyone who studies or applies neural networks.
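As a quick illustration of the two long-standing activation functions named in the abstract, here is a minimal NumPy sketch (not taken from the paper) of how the logistic sigmoid and ReLU shape a neuron's pre-activation. The input, weights, and bias values are arbitrary placeholders chosen for the example.

```python
# Minimal sketch (assumption, not from the paper): logistic sigmoid and ReLU
# applied to a single neuron's pre-activation.
import numpy as np

def logistic(x):
    """Logistic sigmoid: squashes any real input into the interval (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    """Rectified linear unit: zero for negative inputs, identity otherwise."""
    return np.maximum(0.0, x)

# A neuron computes a pre-activation z = w . x + b; the activation function
# then shapes this value into the neuron's output.
rng = np.random.default_rng(0)
x = rng.normal(size=4)   # placeholder inputs
w = rng.normal(size=4)   # placeholder weights
b = 0.1                  # placeholder bias
z = w @ x + b

print("pre-activation:", z)
print("logistic output:", logistic(z))
print("ReLU output:", relu(z))
```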


Related research

06/17/2019 · Smooth function approximation by deep neural networks with general activation functions
There has been a growing interest in expressivity of deep neural network...

04/06/2022 · A survey on recently proposed activation functions for Deep Learning
Artificial neural networks (ANN), typically referred to as neural networ...

01/10/2022 · Quantum activation functions for quantum neural networks
The field of artificial neural networks is expected to strongly benefit ...

11/08/2018 · Activation Functions: Comparison of trends in Practice and Research for Deep Learning
Deep neural networks have been successfully used in diverse emerging dom...

01/17/2020 · Approximating Activation Functions
ReLU is widely seen as the default choice for activation functions in ne...

09/29/2021 · A Comprehensive Survey and Performance Analysis of Activation Functions in Deep Learning
Neural networks have shown tremendous growth in recent years to solve nu...

03/28/2023 · Function Approximation with Randomly Initialized Neural Networks for Approximate Model Reference Adaptive Control
Classical results in neural network approximation theory show how arbitr...
