Saturated Non-Monotonic Activation Functions

05/12/2023
by Junjia Chen et al.

Activation functions are essential to deep learning networks. The most popular and versatile activation functions are monotonic, but several non-monotonic activation functions have been explored and show promising performance. However, in introducing non-monotonicity, these functions also alter the positive inputs, which the success of ReLU and its variants suggests is unnecessary. In this paper, we build on the development of non-monotonic activation functions and propose the Saturated Gaussian Error Linear Units, combining the characteristics of ReLU and of non-monotonic activation functions. We present three new activation functions built with our proposed method: SGELU, SSiLU, and SMish, which combine the negative portions of GELU, SiLU, and Mish, respectively, with ReLU's positive portion. Image classification experiments on CIFAR-100 indicate that our proposed activation functions are highly effective and outperform state-of-the-art baselines across multiple deep learning architectures.
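The construction described in the abstract can be sketched as a piecewise function: the identity (ReLU's behavior) on non-negative inputs, and the base non-monotonic function on negative inputs. The sketch below is a hypothetical reading of the abstract, not the paper's verified formulation; the `saturated` helper and function names are illustrative assumptions.

```python
import math

def gelu(x):
    # Exact GELU: x * Phi(x), where Phi is the standard normal CDF.
    return x * 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def silu(x):
    # SiLU (swish): x * sigmoid(x).
    return x / (1.0 + math.exp(-x))

def mish(x):
    # Mish: x * tanh(softplus(x)).
    return x * math.tanh(math.log1p(math.exp(x)))

def saturated(act):
    """Hypothetical construction from the abstract: ReLU's identity on
    x >= 0, the base activation's negative portion on x < 0."""
    return lambda x: x if x >= 0.0 else act(x)

# The three proposed variants, per the abstract's description.
sgelu = saturated(gelu)
ssilu = saturated(silu)
smish = saturated(mish)
```

On positive inputs all three variants coincide with ReLU, so only the negative lobe differs between them; this is the sense in which the positive input is left unaltered.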


Related research

02/17/2020 - Evolutionary Optimization of Deep Learning Activation Functions
The choice of activation function can have a large effect on the perform...

02/16/2022 - Learning a Single Neuron for Non-monotonic Activation Functions
We study the problem of learning a single neuron 𝐱↦σ(𝐰^T𝐱) with gradient...

03/19/2018 - Deep learning improved by biological activation functions
'Biologically inspired' activation functions, such as the logistic sigmo...

05/28/2023 - ASU-CNN: An Efficient Deep Architecture for Image Classification and Feature Visualizations
Activation functions play a decisive role in determining the capacity of...

02/11/2020 - Goldilocks Neural Networks
We introduce the new "Goldilocks" class of activation functions, which n...

06/24/2022 - Neural Networks with A La Carte Selection of Activation Functions
Activation functions (AFs), which are pivotal to the success (or failure...

05/29/2019 - An Inertial Newton Algorithm for Deep Learning
We devise a learning algorithm for possibly nonsmooth deep neural networ...
