Reproducing Activation Function for Deep Learning

01/13/2021
by Senwei Liang, et al.

In this paper, we propose the reproducing activation function to improve deep learning accuracy in applications ranging from computer vision to scientific computing. The idea of reproducing activation functions is to employ several basic functions and a learnable linear combination of them to construct a neuron-wise, data-driven activation function for each neuron. Armed with such activation functions, deep neural networks can reproduce traditional approximation tools and, therefore, approximate target functions with fewer parameters than traditional neural networks. In terms of training dynamics, reproducing activation functions generate neural tangent kernels with a better condition number than traditional activation functions, lessening the spectral bias of deep learning. As demonstrated by extensive numerical tests, the proposed activation function can facilitate the convergence of deep learning optimization toward solutions of higher accuracy than existing deep learning solvers for audio/image/video reconstruction, PDEs, and eigenvalue problems.
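To make the construction concrete, here is a minimal sketch of a neuron-wise activation built as a learnable linear combination of basic functions. The particular basis below (identity, ReLU, sin) and the class name are illustrative assumptions, not the paper's exact choice; the paper considers several families of basic functions.

```python
import numpy as np

def basis(x):
    """Evaluate the basic functions at x; returns shape (..., n_basis).

    Illustrative basis: identity, ReLU, and sin (an assumption for this
    sketch; the paper studies several such families)."""
    return np.stack([x, np.maximum(x, 0.0), np.sin(x)], axis=-1)

class ReproducingActivation:
    """One learnable coefficient vector per neuron, so each neuron gets
    its own data-driven activation function."""

    def __init__(self, width, n_basis=3, seed=0):
        rng = np.random.default_rng(seed)
        self.coef = rng.normal(scale=0.1, size=(width, n_basis))

    def __call__(self, z):
        # z: (batch, width) pre-activations; combine the basis per neuron.
        return np.einsum('bwk,wk->bw', basis(z), self.coef)

act = ReproducingActivation(width=4)
z = np.array([[-1.0, 0.0, 1.0, 2.0]])
out = act(z)
print(out.shape)  # (1, 4)
```

Because the coefficients are trained along with the network weights, the activation can, for example, collapse to the plain identity or ReLU, or mix in oscillatory terms where the target function demands them; this is how the network can reproduce traditional approximation tools with fewer parameters.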

Related research

- APTx: better activation function than MISH, SWISH, and ReLU's variants used in deep learning (09/10/2022)
  Activation Functions introduce non-linearity in the deep neural networks...

- The Quest for the Golden Activation Function (08/02/2018)
  Deep Neural Networks have been shown to be beneficial for a variety of t...

- A new activation for neural networks and its approximation (10/19/2022)
  Deep learning with deep neural networks (DNNs) has attracted tremendous ...

- A Use of Even Activation Functions in Neural Networks (11/23/2020)
  Despite broad interest in applying deep learning techniques to scientifi...

- An Integer Programming Approach to Deep Neural Networks with Binary Activation Functions (07/07/2020)
  We study deep neural networks with binary activation functions (BDNN), i...

- Learning Neuron Non-Linearities with Kernel-Based Deep Neural Networks (07/17/2018)
  The effectiveness of deep neural architectures has been widely supported...

- K-TanH: Hardware Efficient Activations For Deep Learning (09/17/2019)
  We propose K-TanH, a novel, highly accurate, hardware efficient approxim...
