Liquid Time-constant Recurrent Neural Networks as Universal Approximators

11/01/2018
by Ramin M. Hasani, et al.

In this paper, we introduce the notion of liquid time-constant (LTC) recurrent neural networks (RNNs), a subclass of continuous-time RNNs with a varying neuronal time-constant realized by their nonlinear synaptic transmission model. This feature is inspired by communication principles in the nervous systems of small species. It enables the model to approximate continuous mappings with a small number of computational units. We show that any finite trajectory of an n-dimensional continuous dynamical system can be approximated by the internal states of the hidden units and n output units of an LTC network. We also derive theoretical bounds on the neuronal states and on the varying time-constant.
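
The role of the varying time-constant can be illustrated numerically. The snippet below is a minimal Python sketch, assuming an LTC-style state equation dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A, in which the nonlinear synaptic conductance f makes the effective time-constant tau / (1 + tau * f(x, I)) depend on both the input and the state; the parameter names (tau, A, w, b), the sigmoidal form of f, and the Euler integration are illustrative assumptions rather than the paper's exact formulation.

    # Single LTC-style neuron: dx/dt = -(1/tau + f) * x + f * A, integrated
    # with an explicit Euler step. All parameter values are illustrative.
    import numpy as np

    def ltc_step(x, I, tau=1.0, A=1.0, w=0.5, b=0.0, dt=0.01):
        f = 1.0 / (1.0 + np.exp(-(w * I + b)))  # nonlinear synaptic transmission
        dx = -(1.0 / tau + f) * x + f * A       # state- and input-dependent decay
        return x + dt * dx                      # effective time-constant: tau / (1 + tau * f)

    # Toy usage: drive the neuron with a slow sinusoidal input.
    x = 0.0
    for t in range(1000):
        x = ltc_step(x, np.sin(0.01 * t))

In this sketch, f stays in (0, 1), so the effective time-constant remains between tau / (1 + tau) and tau, mirroring the kind of boundedness result on the varying time-constant stated in the abstract.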


research · 09/11/2018
Re-purposing Compact Neuronal Circuit Policies to Govern Reinforcement Learning Tasks
We propose an effective method for creating interpretable control agents...

research · 03/29/2023
Learning Flow Functions from Data with Applications to Nonlinear Oscillators
We describe a recurrent neural network (RNN) based architecture to learn...

research · 06/03/2019
Gated recurrent units viewed through the lens of continuous time dynamical systems
Gated recurrent units (GRUs) are specialized memory elements for buildin...

research · 06/11/2021
Piecewise-constant Neural ODEs
Neural networks are a popular tool for modeling sequential data but they...

research · 04/18/2023
LTC-SE: Expanding the Potential of Liquid Time-Constant Neural Networks for Scalable AI and Embedded Systems
We present LTC-SE, an improved version of the Liquid Time-Constant (LTC)...

research · 11/08/2021
The Global Structure of Codimension-2 Local Bifurcations in Continuous-Time Recurrent Neural Networks
If we are ever to move beyond the study of isolated special cases in the...

research · 02/26/2020
ResNets, NeuralODEs and CT-RNNs are Particular Neural Regulatory Networks
This paper shows that ResNets, NeuralODEs, and CT-RNNs, are particular n...
