Adaptive Blending Units: Trainable Activation Functions for Deep Neural Networks

06/26/2018
by   Leon René Sütfeld, et al.

The most widely used activation functions in current deep feed-forward neural networks are rectified linear units (ReLUs), though many alternatives have been applied successfully as well. However, no alternative has managed to consistently outperform the rest, and there is no unified theory connecting properties of the task and network with the activation function that trains most efficiently. A possible solution is to have the network learn its preferred activation functions. In this work, we introduce Adaptive Blending Units (ABUs), a trainable linear combination of a set of activation functions. Since ABUs learn the shape as well as the overall scaling of the activation function, we also analyze the effects of adaptive scaling in common activation functions. We experimentally demonstrate advantages of both adaptive scaling and ABUs over common activation functions across a set of systematically varied network specifications. We further show that adaptive scaling works by mitigating covariate shift during training, and that the observed performance advantages of ABUs likewise rely largely on the activation function's ability to adapt over the course of training.
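The core idea of an ABU, as described above, can be sketched as a weighted sum of candidate activation functions with trainable blending weights. The following is a minimal NumPy illustration of the forward pass only; the class name, the choice of candidate functions, and the uniform initialization of the weights are assumptions for illustration, and in practice the blending weights would be learned by backpropagation alongside the network weights.

```python
import numpy as np

def relu(x):
    # rectified linear unit: max(0, x)
    return np.maximum(x, 0.0)

def identity(x):
    return x

class ABU:
    """Sketch of an Adaptive Blending Unit: a trainable linear
    combination of a fixed set of candidate activation functions."""

    def __init__(self, activations):
        self.activations = activations
        # One blending weight per candidate function; uniform
        # initialization so the blend starts as their average.
        # These weights are left unconstrained, so the unit can
        # learn the overall scale of the activation as well.
        self.alpha = np.full(len(activations), 1.0 / len(activations))

    def __call__(self, x):
        # ABU(x) = sum_i alpha_i * f_i(x)
        return sum(a * f(x) for a, f in zip(self.alpha, self.activations))

abu = ABU([relu, np.tanh, identity])
x = np.array([-2.0, 0.0, 2.0])
print(abu(x))
```

Because the weights are unconstrained, the learned combination can scale the activation's output range freely, which is what the abstract refers to as adaptive scaling.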

Related research:
- ErfReLU: Adaptive Activation Function for Deep Neural Network (06/02/2023)
- Deep Kronecker neural networks: A general framework for neural networks with adaptive activation functions (05/20/2021)
- Understanding Locally Competitive Networks (10/05/2014)
- ENN: A Neural Network with DCT-Adaptive Activation Functions (07/02/2023)
- Review and Comparison of Commonly Used Activation Functions for Deep Neural Networks (10/15/2020)
- Trainable Compound Activation Functions for Machine Learning (04/25/2022)
- Adaptive n-ary Activation Functions for Probabilistic Boolean Logic (03/16/2022)
