Deeper Learning with CoLU Activation

12/18/2021
by Advait Vagerwal, et al.

In neural networks, non-linearity is introduced by activation functions. One commonly used activation function is the Rectified Linear Unit (ReLU). ReLU has long been a popular choice as an activation, but it has flaws. State-of-the-art functions such as Swish and Mish are now gaining attention as better choices because they address many of the flaws of earlier activation functions. CoLU is an activation function similar to Swish and Mish in its properties. It is defined as f(x) = x / (1 - x·e^(-(x + e^x))). It is smooth, continuously differentiable, unbounded above, bounded below, non-saturating, and non-monotonic. Based on experiments comparing CoLU with other activation functions, it is observed that CoLU usually performs better than other functions on deeper neural networks. While training different neural networks on MNIST with an incrementally increasing number of convolutional layers, CoLU retained the highest accuracy for the largest number of layers. On a smaller network with 8 convolutional layers, CoLU had the highest mean accuracy, closely followed by ReLU. On VGG-13 trained on Fashion-MNIST, CoLU achieved a 4.20% improvement in accuracy over a competing activation, and on ResNet-9 trained on CIFAR-10 it achieved improvements of 0.05% and 0.09% over competing activations. It is observed that one activation function may outperform another depending on factors including the number of layers, the types of layers, the number of parameters, the learning rate, and the optimizer. Further research on these factors and activation functions can lead to more optimal activation functions and more knowledge of their behavior.
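For concreteness, below is a minimal sketch of CoLU as defined above, written as a PyTorch module. The module name, the surrounding convolutional block, and the input shape are illustrative assumptions for this page, not the authors' code.

import torch
import torch.nn as nn

class CoLU(nn.Module):
    # CoLU as defined in the abstract: f(x) = x / (1 - x * exp(-(x + exp(x))))
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x / (1.0 - x * torch.exp(-(x + torch.exp(x))))

# Illustrative usage: CoLU as a drop-in replacement for ReLU in a small
# convolutional block (not the architecture used in the paper's experiments).
block = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1),
    CoLU(),
    nn.Conv2d(16, 32, kernel_size=3, padding=1),
    CoLU(),
)
out = block(torch.randn(8, 1, 28, 28))  # e.g. a batch of MNIST-sized inputs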

