A new activation for neural networks and its approximation

10/19/2022
by Jianfei Li et al.

Deep learning with deep neural networks (DNNs) has recently attracted tremendous attention across various fields of science and technology. Activation functions define the output of a neuron given an input or set of inputs; they are essential for learning non-linear transformations and performing diverse computations between successive layers. The design of activation functions therefore remains an important topic in deep learning research. Meanwhile, the approximation ability of DNNs with various activation functions has been studied intensively in recent years. In this paper, we propose a new activation function, named "DLU", and investigate its approximation ability for functions with various smoothness and structures. Our theoretical results show that DLU networks achieve approximation performance competitive with that of rational and ReLU networks, and offer some additional advantages. Numerical experiments comparing DLU with the existing activations ReLU, Leaky ReLU, and ELU illustrate DLU's good practical performance.
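
For context on the numerical comparison, below is a minimal NumPy sketch of the three baseline activations the experiments use (ReLU, Leaky ReLU, and ELU), based only on their standard textbook definitions. The DLU activation itself is defined in the full paper and is not reproduced here.

import numpy as np

def relu(x):
    # ReLU: x for x >= 0, and 0 otherwise.
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Leaky ReLU: x for x >= 0, and alpha * x otherwise,
    # so negative inputs keep a small non-zero slope.
    return np.where(x >= 0.0, x, alpha * x)

def elu(x, alpha=1.0):
    # ELU: x for x >= 0, and alpha * (exp(x) - 1) otherwise,
    # giving a smooth, bounded-below negative branch.
    return np.where(x >= 0.0, x, alpha * np.expm1(x))

x = np.linspace(-3.0, 3.0, 7)
print(relu(x))        # [0. 0. 0. 0. 1. 2. 3.]
print(leaky_relu(x))  # [-0.03 -0.02 -0.01  0.    1.    2.    3.  ]
print(elu(x))         # [-0.950... -0.864... -0.632...  0.  1.  2.  3.]
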

Related research:

- Rational neural networks (04/04/2020): We consider neural networks with rational activation functions. The choi...
- Reproducing Activation Function for Deep Learning (01/13/2021): In this paper, we propose the reproducing activation function to improve...
- Abstract Neural Networks (09/11/2020): Deep Neural Networks (DNNs) are rapidly being applied to safety-critical...
- A Comprehensive Survey and Performance Analysis of Activation Functions in Deep Learning (09/29/2021): Neural networks have shown tremendous growth in recent years to solve nu...
- Analytical aspects of non-differentiable neural networks (11/03/2020): Research in computational deep learning has directed considerable effort...
- Neural Polytopes (07/03/2023): We find that simple neural networks with ReLU activation generate polyto...
- Growing axons: greedy learning of neural networks with application to function approximation (10/28/2019): We propose a new method for learning deep neural network models that is ...
