Rational neural networks

04/04/2020
by Nicolas Boullé, et al.

We consider neural networks with rational activation functions. The choice of the nonlinear activation function in deep learning architectures is crucial and heavily impacts the performance of a neural network. We establish optimal bounds in terms of network complexity and prove that rational neural networks approximate smooth functions more efficiently than ReLU networks. The flexibility and smoothness of rational activation functions make them an attractive alternative to ReLU, as we demonstrate with numerical experiments.
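
As a concrete illustration of the idea, a rational activation can be implemented as a small trainable layer whose output is a ratio of polynomials. The sketch below assumes PyTorch and a type-(3, 2) rational (cubic numerator, quadratic denominator); the pole-avoiding denominator form and the identity-like initial coefficients are illustrative choices, not the paper's exact parameterization or its ReLU-approximating initialization.

```python
import torch
import torch.nn as nn

class RationalActivation(nn.Module):
    """Trainable rational activation r(x) = P(x) / Q(x) of type (3, 2).

    Illustrative sketch only: coefficients start near the identity map
    rather than a ReLU-approximating initialization, and the
    absolute-value denominator is one common way to keep Q(x) positive.
    """

    def __init__(self):
        super().__init__()
        # Numerator P(x) = a0 + a1*x + a2*x^2 + a3*x^3 (trainable).
        self.a = nn.Parameter(torch.tensor([0.0, 1.0, 0.0, 0.0]))
        # Denominator Q(x) = 1 + |b1*x + b2*x^2| (trainable, pole-free).
        self.b = nn.Parameter(torch.tensor([0.0, 0.0]))

    def forward(self, x):
        p = self.a[0] + self.a[1] * x + self.a[2] * x**2 + self.a[3] * x**3
        q = 1.0 + torch.abs(self.b[0] * x + self.b[1] * x**2)
        return p / q

# Drop the activation into an ordinary feedforward network.
model = nn.Sequential(
    nn.Linear(1, 32), RationalActivation(),
    nn.Linear(32, 32), RationalActivation(),
    nn.Linear(32, 1),
)
y = model(torch.linspace(-1.0, 1.0, 100).unsqueeze(1))
```

Because the numerator and denominator coefficients are ordinary parameters, they are learned jointly with the network weights during training, which is what gives rational activations their flexibility relative to a fixed ReLU.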

Related research

11/08/2021  SMU: smooth activation function for deep networks using smoothing maximum technique
  Deep learning researchers have a keen interest in proposing two new nove...

10/19/2022  A new activation for neural networks and its approximation
  Deep learning with deep neural networks (DNNs) has attracted tremendous ...

06/11/2017  Neural networks and rational functions
  Neural networks and rational functions efficiently approximate each othe...

02/18/2021  Recurrent Rational Networks
  Latest insights from biology show that intelligence does not only emerge...

05/18/2018  Tropical Geometry of Deep Neural Networks
  We establish, for the first time, connections between feedforward neural...

07/12/2023  Rational Neural Network Controllers
  Neural networks have shown great success in many machine learning relate...

05/22/2023  DeepBern-Nets: Taming the Complexity of Certifying Neural Networks using Bernstein Polynomial Activations and Precise Bound Propagation
  Formal certification of Neural Networks (NNs) is crucial for ensuring th...
