NARX Identification using Derivative-Based Regularized Neural Networks

04/12/2022
by L. H. Peeters, et al.

This work presents a novel regularization method for the identification of Nonlinear Autoregressive eXogenous (NARX) models. The regularization promotes an exponential decay of the influence of past input samples on the current model output. This is achieved by penalizing the sensitivity (i.e., the partial derivative) of the simulated NARX model output with respect to the past inputs. The effectiveness of the approach is demonstrated on a simulation example in which a neural-network NARX model is identified with the proposed method. Moreover, the proposed regularization is shown to improve model accuracy, in terms of simulation error, compared with other regularization methods and model classes.
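To illustrate the general idea (a minimal sketch, not the authors' implementation): simulate the neural-network NARX model in free run, differentiate the simulated output with respect to the input sequence via automatic differentiation, and add a penalty that weights the sensitivity at lag tau by rho^(-tau), so that the sensitivities must decay at least exponentially for the penalty to stay small. The PyTorch code below assumes illustrative lag orders, network size, decay rate `rho`, and regularization weight `gamma`; the exact weighting and model structure used in the paper may differ.

```python
# Illustrative sketch of derivative-based regularization for a neural-network
# NARX model (assumed architecture and hyperparameters, not the paper's).
import torch
import torch.nn as nn

na, nb = 3, 3          # assumed output / input lag orders
rho = 0.8              # assumed target decay rate of the input influence
gamma = 1e-3           # assumed regularization strength

class NarxNet(nn.Module):
    def __init__(self, na, nb, hidden=32):
        super().__init__()
        self.na, self.nb = na, nb
        self.net = nn.Sequential(
            nn.Linear(na + nb, hidden), nn.Tanh(), nn.Linear(hidden, 1))

    def simulate(self, u, y0):
        """Free-run simulation: past outputs are the model's own predictions."""
        y = list(y0)                                   # initial conditions, length na
        for t in range(self.na, len(u)):
            phi = torch.cat([torch.stack(y[-self.na:]),
                             u[t - self.nb + 1:t + 1]])
            y.append(self.net(phi).squeeze())
        return torch.stack(y[self.na:])

def sensitivity_penalty(model, u, y0):
    """Penalize d y_sim(T) / d u(T - tau), weighted by rho**(-tau)."""
    u = u.clone().requires_grad_(True)
    y_sim = model.simulate(u, y0)
    grad_u, = torch.autograd.grad(y_sim[-1], u, create_graph=True)
    T = len(u) - 1
    lags = torch.arange(T, -1, -1, dtype=torch.float32)   # lag of each input sample
    weights = rho ** (-lags)                               # heavier penalty at larger lags
    return (weights * grad_u ** 2).sum()

# Illustrative training step on placeholder data
torch.manual_seed(0)
model = NarxNet(na, nb)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
u = torch.randn(50)                    # placeholder input sequence
y_meas = torch.randn(50 - na)          # placeholder measured output
y0 = [torch.zeros(()) for _ in range(na)]

y_sim = model.simulate(u, y0)
loss = ((y_sim - y_meas) ** 2).mean() + gamma * sensitivity_penalty(model, u, y0)
opt.zero_grad()
loss.backward()
opt.step()
```

Note that `create_graph=True` keeps the sensitivity computation differentiable, so the penalty can be backpropagated through to the network weights together with the simulation-error loss.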

