E-swish: Adjusting Activations to Different Network Depths

01/22/2018
by   Eric Alcaide, et al.

Activation functions have a notable impact on neural networks, affecting both how they train and how well the resulting models perform on the target problem. Currently, the most widely used activation function is the Rectified Linear Unit (ReLU). This paper introduces a novel activation function, closely related to the recently proposed Swish = x * sigmoid(x) (Ramachandran et al., 2017), which it generalizes. We call the new activation E-swish = β * x * sigmoid(x). We show that E-swish outperforms many other well-known activations, including both ReLU and Swish. For example, using E-swish provided 1.5% improvements on Cifar10 and Cifar100 for the WRN 10-2 when compared to ReLU, and 0.35% when compared to Swish. The code to reproduce all our experiments can be found at https://github.com/EricAlcaide/E-swish
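The definition E-swish(x) = β * x * sigmoid(x) is simple enough to sketch directly. The snippet below is a minimal illustration assuming PyTorch; the β value of 1.5 is only an example hyperparameter, and the function name e_swish is our own.

# Minimal sketch of E-swish, assuming PyTorch is available.
# beta is a hyperparameter; beta = 1 recovers the original Swish.
import torch

def e_swish(x: torch.Tensor, beta: float = 1.5) -> torch.Tensor:
    # E-swish(x) = beta * x * sigmoid(x)
    return beta * x * torch.sigmoid(x)

if __name__ == "__main__":
    x = torch.linspace(-3.0, 3.0, steps=7)
    print(e_swish(x))             # illustrative beta = 1.5
    print(e_swish(x, beta=1.0))   # beta = 1 gives plain Swish

Because the activation is element-wise and differentiable, it can be dropped into existing architectures wherever ReLU or Swish would otherwise be used.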

