
Activation Functions in Artificial Neural Networks: A Systematic Overview

01/25/2021
by Johannes Lederer, et al.

Activation functions shape the outputs of artificial neurons and are therefore integral parts of neural networks in general and deep learning in particular. Some activation functions, such as the logistic function and the rectified linear unit (ReLU), have been used for decades. But as deep learning has become a mainstream research topic, new activation functions have mushroomed, leading to confusion in both theory and practice. This paper provides an analytic yet up-to-date overview of popular activation functions and their properties, making it a timely resource for anyone who studies or applies neural networks.
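To make the two classical activation functions named in the abstract concrete, here is a minimal NumPy sketch of the logistic function and ReLU; the function names and example inputs are illustrative, not taken from the paper.

```python
import numpy as np

def logistic(x):
    # Logistic (sigmoid) activation: maps any real input into (0, 1).
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    # Rectified linear unit: passes positive inputs through, zeros out negatives.
    return np.maximum(0.0, x)

z = np.array([-2.0, 0.0, 3.0])
print(logistic(z))  # values strictly between 0 and 1; logistic(0) = 0.5
print(relu(z))      # [0. 0. 3.]
```

Both functions are applied elementwise to a neuron's pre-activation, which is what allows stacked layers to represent nonlinear mappings.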


06/17/2019

Smooth function approximation by deep neural networks with general activation functions

There has been a growing interest in expressivity of deep neural network...
04/06/2022

A survey on recently proposed activation functions for Deep Learning

Artificial neural networks (ANN), typically referred to as neural networ...
01/10/2022

Quantum activation functions for quantum neural networks

The field of artificial neural networks is expected to strongly benefit ...
11/08/2018

Activation Functions: Comparison of trends in Practice and Research for Deep Learning

Deep neural networks have been successfully used in diverse emerging dom...
01/17/2020

Approximating Activation Functions

ReLU is widely seen as the default choice for activation functions in ne...
09/29/2021

A Comprehensive Survey and Performance Analysis of Activation Functions in Deep Learning

Neural networks have shown tremendous growth in recent years to solve nu...
11/11/2020

Domain Wall Leaky Integrate-and-Fire Neurons with Shape-Based Configurable Activation Functions

Complementary metal oxide semiconductor (CMOS) devices display volatile ...