Lipschitz Recurrent Neural Networks

06/23/2020
by N. Benjamin Erichson et al.

Differential equations are a natural choice for modeling recurrent neural networks because they can be viewed as dynamical systems with a driving input. In this work, we propose a recurrent unit that describes the hidden state's evolution with two parts: a well-understood linear component plus a Lipschitz nonlinearity. This particular functional form simplifies stability analysis, which enables us to provide an asymptotic stability guarantee. Further, we demonstrate that Lipschitz recurrent units are more robust with respect to perturbations. We evaluate our approach on a range of benchmark tasks, and we show it outperforms existing recurrent units.
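The hidden-state update described above can be sketched numerically. The following is a minimal illustration, not the authors' implementation: it assumes a forward-Euler discretization of a continuous-time update of the form h' = A h + tanh(W h + U x + b), where the linear part A is parameterized (hypothetically, here via a damped skew-symmetric construction) so that its eigenvalues have negative real parts, which is the kind of structure that makes stability analysis tractable.

```python
import numpy as np

def damped_A(M, beta=0.75, gamma=0.9):
    """Illustrative parameterization of the linear component:
    a convex mix of the symmetric and skew-symmetric parts of M,
    minus a damping term gamma*I that pushes eigenvalue real
    parts toward the negative half-plane. (Hypothetical choice
    of beta/gamma, for demonstration only.)"""
    sym = (M + M.T) / 2.0
    skew = (M - M.T) / 2.0
    return (1.0 - beta) * sym + beta * skew - gamma * np.eye(M.shape[0])

def lipschitz_rnn_step(h, x, A, W, U, b, dt=0.1):
    """One forward-Euler step of h' = A h + tanh(W h + U x + b).
    tanh is 1-Lipschitz, so the nonlinearity's contribution to the
    dynamics is bounded."""
    return h + dt * (A @ h + np.tanh(W @ h + U @ x + b))

# Drive the unit with a random input sequence.
rng = np.random.default_rng(0)
d, n = 4, 3                      # hidden size, input size (arbitrary)
A = damped_A(rng.standard_normal((d, d)))
W = 0.1 * rng.standard_normal((d, d))
U = 0.1 * rng.standard_normal((d, n))
b = np.zeros(d)

h = np.zeros(d)
for _ in range(100):
    h = lipschitz_rnn_step(h, rng.standard_normal(n), A, W, U, b)
```

Because the nonlinearity is bounded and the linear part is damped, the hidden state stays bounded over the rollout rather than diverging, which is the qualitative behavior the stability guarantee formalizes.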


research
07/19/2021

Ergodic dynamical systems over the Cartesian power of the ring of p-adic integers

For any 1-Lipschitz ergodic map F: ℤ^k_p↦ℤ^k_p, k>1, k∈ℕ, there are 1-Lipsc...
research
04/13/2021

Recurrent Equilibrium Networks: Unconstrained Learning of Stable and Robust Dynamical Models

This paper introduces recurrent equilibrium networks (RENs), a new class...
research
05/25/2018

When Recurrent Models Don't Need To Be Recurrent

We prove stable recurrent neural networks are well approximated by feed-...
research
11/27/2018

Chasing the Echo State Property

Reservoir Computing (RC) provides an efficient way for designing dynamic...
research
06/03/2019

Gated recurrent units viewed through the lens of continuous time dynamical systems

Gated recurrent units (GRUs) are specialized memory elements for buildin...
research
11/13/2020

On the stability properties of Gated Recurrent Units neural networks

The goal of this paper is to provide sufficient conditions for guarantee...
research
07/21/2022

Bayesian Recurrent Units and the Forward-Backward Algorithm

Using Bayes's theorem, we derive a unit-wise recurrence as well as a bac...
