Recurrent Rational Networks

02/18/2021
by Quentin Delfosse, et al.

Latest insights from biology show that intelligence does not emerge solely from the connections between neurons, but that individual neurons shoulder more computational responsibility. Current Neural Network architecture design and search are biased towards fixed activation functions. Using more advanced, learnable activation functions provides Neural Networks with higher learning capacity. However, general guidance for building such networks is still missing. In this work, we first explain why rationals offer an optimal choice of activation function. We then show that they are closed under residual connections and, inspired by recurrence in residual networks, we derive a self-regularized version of Rationals: Recurrent Rationals. We demonstrate that (Recurrent) Rational Networks lead to substantial performance improvements on Image Classification and Deep Reinforcement Learning.
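The closure claim has a one-line justification: if R(x) = P(x)/Q(x) is rational, then the residual branch R(x) + x = (P(x) + x·Q(x)) / Q(x) is again rational, with numerator degree at most max(deg P, deg Q + 1). The sketch below is a minimal PyTorch illustration of a learnable rational activation in the "safe" form P(x) / (1 + |b₁x + … + bₙxⁿ|), together with a residual block that reuses one shared instance for every activation call, one simplified reading of the "recurrent" (weight-shared) Rationals described in the abstract. The degrees (5, 4), the initialization, and the two-convolution block layout are illustrative assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn


class Rational(nn.Module):
    """Learnable rational activation R(x) = P(x) / (1 + |b_1 x + ... + b_n x^n|).

    The absolute value keeps the denominator >= 1, so R has no poles on the
    real line. Degrees (5, 4) and the small random init are assumptions for
    illustration, not the paper's exact recipe.
    """

    def __init__(self, num_degree: int = 5, den_degree: int = 4):
        super().__init__()
        self.numerator = nn.Parameter(0.1 * torch.randn(num_degree + 1))  # a_0..a_m
        self.denominator = nn.Parameter(0.1 * torch.randn(den_degree))    # b_1..b_n

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Horner evaluation of P(x) = a_m x^m + ... + a_0.
        p = torch.zeros_like(x)
        for a in reversed(self.numerator):
            p = p * x + a
        # Horner evaluation of b_n x^(n-1) + ... + b_1, then one extra factor of x.
        q = torch.zeros_like(x)
        for b in reversed(self.denominator):
            q = q * x + b
        return p / (1.0 + (q * x).abs())


class RecurrentRationalBlock(nn.Module):
    """Residual block that reuses a single shared Rational for every activation,
    so all call sites train the same coefficients (weight sharing as a crude
    stand-in for the paper's recurrent Rationals)."""

    def __init__(self, channels: int, shared_act: Rational):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.act = shared_act  # same module object across blocks

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.act(self.conv1(x))
        out = self.conv2(out)
        return self.act(out + x)  # residual sum stays in the rational family


if __name__ == "__main__":
    shared = Rational()
    net = nn.Sequential(*[RecurrentRationalBlock(16, shared) for _ in range(3)])
    print(net(torch.randn(2, 16, 8, 8)).shape)  # torch.Size([2, 16, 8, 8])
```

Sharing one `Rational` instance keeps the activation's parameter count negligible while coupling all layers through the same learned nonlinearity, which is one way to read the "self-regularized" framing.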


