Deep Network Approximation: Beyond ReLU to Diverse Activation Functions

07/13/2023 βˆ™ by Shijun Zhang, et al.
This paper explores the expressive power of deep neural networks for a diverse range of activation functions. An activation function set π’œ is defined to encompass the majority of commonly used activation functions, such as ReLU, LeakyReLU, ReLU^2, ELU, SELU, Softplus, GELU, SiLU, Swish, Mish, Sigmoid, Tanh, Arctan, Softsign, dSiLU, and SRS. We demonstrate that for any activation function Ο± ∈ π’œ, a ReLU network of width N and depth L can be approximated to arbitrary precision by a Ο±-activated network of width 6N and depth 2L on any bounded set. This finding enables the extension of most approximation results achieved with ReLU networks to a wide variety of other activation functions, at the cost of slightly larger constants.
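The flavor of this result can be illustrated numerically. The sketch below is not the paper's construction; it only checks the well-known fact that a rescaled Softplus unit, softplus(kx)/k, converges to ReLU uniformly on a bounded set as the scale k grows, which is the kind of "emulate ReLU with a smooth activation on a bounded set" behavior the theorem generalizes.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def softplus_relu(x, k):
    # (1/k) * softplus(k * x); np.logaddexp(0, t) = log(1 + e^t) computed stably.
    # On a bounded set, this converges uniformly to ReLU(x) as k -> infinity,
    # with worst-case error log(2)/k attained at x = 0.
    return np.logaddexp(0.0, k * x) / k

# Bounded set [-5, 5], sampled on a fine grid that includes x = 0.
x = np.linspace(-5.0, 5.0, 1001)
for k in (1.0, 10.0, 100.0):
    err = np.max(np.abs(softplus_relu(x, k) - relu(x)))
    print(f"k = {k:6.1f}  sup error = {err:.2e}")
```

Running this shows the sup-norm error shrinking proportionally to 1/k, i.e. the smooth network can match the ReLU network to any desired precision on the bounded set.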

