Regularized Flexible Activation Function Combinations for Deep Neural Networks

07/26/2020
by Renlong Jie, et al.

Activation functions in deep neural networks are fundamental to achieving non-linear mappings. Traditional studies mainly focus on finding fixed activations for a particular set of learning tasks or model architectures, and research on flexible activations remains limited in both design philosophy and application scenarios. In this study, three principles for choosing flexible activation components are proposed and a general combined form of flexible activation functions is implemented. Based on this, a novel family of flexible activation functions that can replace sigmoid or tanh in LSTM cells is implemented, as well as a new family combining ReLU and ELUs. In addition, two new regularization terms that encode assumptions as prior knowledge are introduced. Experiments show that LSTM models with the proposed flexible activation P-Sig-Ramp provide significant improvements in time series forecasting, while the proposed P-E2-ReLU achieves better and more stable performance on lossy image compression with convolutional autoencoders. Moreover, the proposed regularization terms improve the convergence, performance and stability of models with flexible activation functions.
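The abstract describes flexible activations as learnable combinations of fixed components, regularized toward prior assumptions. As a concrete illustration, here is a minimal PyTorch sketch of that idea: a trainable convex combination of ReLU and two ELU-style components (loosely in the spirit of P-E2-ReLU), plus a penalty that keeps the combination weights near a uniform prior. The class name FlexibleActivation, the reflected-ELU component, and the combination_penalty regularizer are illustrative assumptions; the paper's exact parameterization is not given in this abstract.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FlexibleActivation(nn.Module):
    """Learnable combination of fixed activation components.

    Hypothetical sketch: a convex combination of ReLU and two
    ELU-style components with trainable weights, loosely inspired
    by the P-E2-ReLU family named in the abstract.
    """

    def __init__(self):
        super().__init__()
        # One learnable logit per component; softmax keeps the
        # combination weights positive and summing to one.
        self.logits = nn.Parameter(torch.zeros(3))

    def forward(self, x):
        w = torch.softmax(self.logits, dim=0)
        components = torch.stack([
            F.relu(x),    # ReLU component
            F.elu(x),     # ELU component
            -F.elu(-x),   # reflected ELU (assumed second ELU variant)
        ])
        # Weighted sum over the component dimension.
        return torch.einsum('c,c...->...', w, components)

    def combination_penalty(self, strength=1e-3):
        """Hypothetical regularizer: an L2 penalty discouraging the
        weights from drifting far from their uniform prior."""
        w = torch.softmax(self.logits, dim=0)
        prior = torch.full_like(w, 1.0 / w.numel())
        return strength * torch.sum((w - prior) ** 2)
```

In training, such a penalty would simply be added to the task loss, e.g. `loss = criterion(model(x), y) + sum(m.combination_penalty() for m in model.modules() if isinstance(m, FlexibleActivation))`, so that gradient descent trades off fit against deviation from the prior.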

