On the stability properties of Gated Recurrent Units neural networks

11/13/2020
by Fabio Bonassi, et al.

The goal of this paper is to provide sufficient conditions guaranteeing the Input-to-State Stability (ISS) and the Incremental Input-to-State Stability (δISS) of Gated Recurrent Unit (GRU) neural networks. These conditions, devised for both single-layer and multi-layer architectures, take the form of nonlinear inequalities on the network's weights. They can be employed to check the stability of a trained network, or enforced as constraints during the training of a GRU. The resulting training procedure is tested on a Quadruple Tank nonlinear benchmark system, showing satisfactory modeling performance.
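The abstract does not reproduce the explicit form of these weight inequalities, so the following is only a minimal, hypothetical sketch of the general idea of enforcing a stability condition during training. As an illustrative stand-in for the paper's actual δISS inequalities, it adds a soft penalty that pushes the infinity-norms of the recurrent GRU weight matrices below an assumed threshold rho; the penalty form, the threshold, and all dimensions are assumptions, not results from the paper.

    # Hedged sketch (PyTorch assumed available): penalize recurrent weight
    # norms as a stand-in for the paper's stability inequalities.
    import torch
    import torch.nn as nn

    class StabilityRegularizedGRU(nn.Module):
        def __init__(self, n_in, n_hidden, rho=0.95):
            super().__init__()
            self.gru = nn.GRU(n_in, n_hidden, batch_first=True)
            self.out = nn.Linear(n_hidden, 1)
            self.rho = rho  # assumed norm bound, NOT the paper's condition

        def forward(self, u):
            h, _ = self.gru(u)   # u: (batch, time, n_in)
            return self.out(h)   # one output per time step

        def stability_penalty(self):
            # weight_hh_l0 stacks the reset/update/candidate recurrent
            # matrices (each n_hidden x n_hidden) along dim 0.
            W_r, W_z, W_n = self.gru.weight_hh_l0.chunk(3, dim=0)
            pen = 0.0
            for W in (W_r, W_z, W_n):
                pen = pen + torch.relu(
                    torch.linalg.matrix_norm(W, ord=float("inf")) - self.rho)
            return pen

    model = StabilityRegularizedGRU(n_in=2, n_hidden=16)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    u = torch.randn(8, 50, 2)  # toy input sequences (batch, time, inputs)
    y = torch.randn(8, 50, 1)  # toy targets
    for _ in range(100):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(u), y) \
            + 10.0 * model.stability_penalty()
        loss.backward()
        opt.step()

The same penalty computation could be evaluated once, after training, to check whether a given network satisfies the assumed norm bound; the check proposed in the paper instead uses the nonlinear inequalities derived there.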


Related research

05/30/2023 · On the Stability of Gated Graph Neural Networks
In this paper, we aim to find the conditions for input-state stability …

03/03/2021 · Nonlinear MPC for Offset-Free Tracking of systems learned by GRU Neural Networks
The use of Recurrent Neural Networks (RNNs) for system identification ha…

08/10/2021 · Recurrent neural network-based Internal Model Control of unknown nonlinear stable systems
Owing to their superior modeling capabilities, gated Recurrent Neural Ne…

02/16/2023 · Stabilising and accelerating light gated recurrent units for automatic speech recognition
The light gated recurrent units (Li-GRU) is well-known for achieving imp…

03/15/2023 · Gated Compression Layers for Efficient Always-On Models
Mobile and embedded machine learning developers frequently have to compr…

11/08/2021 · On the Stochastic Stability of Deep Markov Models
Deep Markov models (DMM) are generative models that are scalable and exp…

06/23/2020 · Lipschitz Recurrent Neural Networks
Differential equations are a natural choice for modeling recurrent neura…
