ResNets, NeuralODEs and CT-RNNs are Particular Neural Regulatory Networks

02/26/2020
by Radu Grosu, et al.

This paper shows that ResNets, NeuralODEs, and CT-RNNs are particular neural regulatory networks (NRNs), a biophysical model of the nonspiking neurons found in small species, such as the C. elegans nematode, and in the retina of larger species. Compared to ResNets, NeuralODEs, and CT-RNNs, NRNs have an additional multiplicative term in their synaptic computation, which allows them to adapt to each particular input. This additional flexibility makes NRNs M times more succinct than NeuralODEs and CT-RNNs, where M is proportional to the size of the training set. Moreover, since NeuralODEs and CT-RNNs are N times more succinct than ResNets, where N is the number of integration steps required to compute the output F(x) for a given input x, NRNs are in total M · N times more succinct than ResNets. For a given approximation task, this considerable succinctness allows one to learn a very small, and therefore understandable, NRN whose behavior can be explained in terms of well-established architectural motifs that NRNs share with gene regulatory networks, such as activation, inhibition, sequentialization, mutual exclusion, and synchronization. To the best of our knowledge, this paper is the first to quantitatively unify the mainstream work on deep neural networks with that in biology and neuroscience.
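To make the abstract's central distinction concrete, the sketch below contrasts one explicit-Euler integration step of a classic CT-RNN, whose synaptic term W · σ(x) enters purely additively, with a conductance-based nonspiking-neuron step in which the synaptic activation multiplies a driving force (E − x), so the effective dynamics depend on the input. This is a minimal illustration of the kind of multiplicative synaptic term the abstract refers to, not the paper's exact equations; the function names, the reversal-potential vector E, and the specific forms used here are assumptions for the sake of the example.

```python
import numpy as np

def sigmoid(z):
    """Logistic activation used in both models below."""
    return 1.0 / (1.0 + np.exp(-z))

def ct_rnn_step(x, W, tau, I, dt=0.01):
    # Classic CT-RNN dynamics: dx/dt = -x/tau + W @ sigmoid(x) + I.
    # The synaptic contribution W @ sigmoid(x) is purely additive.
    return x + dt * (-x / tau + W @ sigmoid(x) + I)

def nrn_step(x, W, E, tau, I, dt=0.01):
    # Conductance-based (nonspiking-neuron) dynamics: the synaptic
    # current g * (E - x) multiplies the presynaptic activation by the
    # driving force (E - x), making the effective time constant of each
    # neuron depend on its current state and input.
    g = W @ sigmoid(x)                 # input-dependent conductances
    return x + dt * (-x / tau + g * (E - x) + I)
```

A single step of either model maps a state vector to a state vector; iterating `nrn_step` N times plays the role that stacking N residual blocks plays in a ResNet, which is the source of the N-fold succinctness comparison in the abstract.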


