Data-aware customization of activation functions reduces neural network error

01/16/2023
by Fuchang Gao, et al.

Activation functions play critical roles in neural networks, yet current off-the-shelf neural networks pay little attention to the specific choice of activation function. Here we show that data-aware customization of activation functions can result in striking reductions in neural network error. We first give a simple linear-algebraic explanation of the role of activation functions in neural networks; then, through a connection with the Diaconis-Shahshahani Approximation Theorem, we propose a set of criteria for good activation functions. As a case study, we consider regression tasks with a partially exchangeable target function, i.e., f(u,v,w)=f(v,u,w) for u,v∈ℝ^d and w∈ℝ^k, and prove that for such a target function, using an even activation function in at least one of the layers guarantees that the prediction preserves partial exchangeability, as required for best performance. Since even activation functions are seldom used in practice, we designed the "seagull" even activation function log(1+x^2) according to our criteria. Empirical testing on over two dozen 9- to 25-dimensional examples with different local smoothness, curvature, and degree of exchangeability revealed that a simple substitution with the "seagull" activation function in an already-refined neural network can lead to an order-of-magnitude reduction in error. This improvement was most pronounced when the substitution was applied to the layer in which the exchangeable variables are connected for the first time. While the improvement is greatest for low-dimensional data, experiments on the CIFAR-10 image classification dataset showed that "seagull" can reduce error even in high-dimensional cases. These results collectively highlight the potential of customizing activation functions as a general approach to improving neural network performance.
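The seagull activation described above is straightforward to implement. The following is a minimal NumPy sketch (the function name `seagull` is taken from the paper's informal label; the numerical check is ours) that also verifies the evenness property the argument relies on, since seagull(-x) = log(1+(-x)^2) = seagull(x):

```python
import numpy as np

def seagull(x):
    """The "seagull" even activation function log(1 + x^2).

    log1p is used for better numerical accuracy when x is near zero.
    """
    return np.log1p(np.square(x))

# Evenness check: an even activation satisfies f(-x) = f(x),
# which is what lets a layer preserve partial exchangeability.
x = np.linspace(-5.0, 5.0, 101)
assert np.allclose(seagull(x), seagull(-x))
```

In a framework such as PyTorch or TensorFlow, the same expression can be dropped in as an elementwise activation in place of, e.g., ReLU in a chosen layer; per the paper, the layer where the exchangeable variables first interact is the most effective place for the substitution.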


Related research:

01/28/2019 · Activation Adaptation in Neural Networks
11/23/2020 · A Use of Even Activation Functions in Neural Networks
07/29/2021 · Optimization of Weights and Activation Functions of Neural Networks Applied to Time Series Prediction (original title: "Otimizacao de pesos e funcoes de ativacao de redes neurais aplicadas na previsao de series temporais")
06/29/2023 · Why Shallow Networks Struggle with Approximating and Learning High Frequency: A Numerical Study
11/12/2022 · MixBin: Towards Budgeted Binarization
09/07/2022 · Parallel and Streaming Wavelet Neural Networks for Classification and Regression under Apache Spark
