Layer-level activation mechanism

06/08/2023
by Kihyuk Yoon, et al.

In this work, we propose a novel activation mechanism aimed at establishing layer-level activation (LayerAct) functions. These functions are designed to be more noise-robust than traditional element-level activation functions, reducing layer-level fluctuations of the activation outputs caused by shifts in the inputs. Moreover, LayerAct functions achieve a zero-like mean activation output without restricting the activation output space. We present analysis and experiments demonstrating that LayerAct functions are more noise-robust than element-level activation functions, and empirically show that they have a zero-like mean activation. Experimental results on three benchmark image classification tasks show that LayerAct functions excel at handling noisy image datasets, outperforming element-level activation functions, while their performance on clean datasets is also superior in most cases.
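To make the layer-level idea concrete, below is a minimal sketch of what a LayerAct-style function could look like in PyTorch. It gates each unit by the sigmoid of its layer-normalized pre-activation rather than its raw value, so the activation scale is driven by layer-level statistics and the mean-centering pushes the mean output toward zero. The class name `LayerActSiLU` and all implementation details are assumptions for illustration, not the authors' reference code.

```python
import torch
import torch.nn as nn


class LayerActSiLU(nn.Module):
    """Hypothetical sketch of a layer-level activation (LayerAct-style).

    Element-level SiLU gates each unit by its own value (x * sigmoid(x));
    here the gate is computed from the layer-normalized pre-activation,
    so a shift applied to the whole layer changes the gates less.
    """

    def __init__(self, eps: float = 1e-5):
        super().__init__()
        self.eps = eps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Layer-level statistics over the feature dimension.
        mu = x.mean(dim=-1, keepdim=True)
        var = x.var(dim=-1, unbiased=False, keepdim=True)
        n = (x - mu) / torch.sqrt(var + self.eps)
        # Gate each unit by the sigmoid of its normalized value.
        return x * torch.sigmoid(n)


if __name__ == "__main__":
    act = LayerActSiLU()
    x = torch.randn(4, 16)
    y = act(x)
    print(y.shape, y.mean().item())  # mean is near zero
```

As a quick sanity check on the claimed noise behavior, adding a constant shift to `x` leaves the normalized gates `sigmoid(n)` unchanged in this sketch, whereas an element-level gate `sigmoid(x)` would move with the shift.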

