Beyond Occam's Razor in System Identification: Double-Descent when Modeling Dynamics

by Antônio H. Ribeiro, et al.

System identification aims to build models of dynamical systems from data. Traditionally, choosing a model requires the designer to balance two conflicting goals: the model must be rich enough to capture the system dynamics, but not so flexible that it learns spurious random effects from the dataset. Model validation performance is typically observed to follow a U-shaped curve as model complexity increases. Recent developments in machine learning and statistics, however, have revealed situations where a "double-descent" curve subsumes this U-shaped model-performance curve, with a second decrease in validation error occurring beyond the point where the model has reached the capacity of interpolating - i.e., (near) perfectly fitting - the training data. To the best of our knowledge, however, such phenomena have not been studied within the context of the identification of dynamic systems. The present paper aims to answer the question: "Can such a phenomenon also be observed when estimating parameters of dynamic systems?" We show the answer is yes, verifying such behavior experimentally both for artificially generated and real-world datasets.
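The double-descent shape described in the abstract can be reproduced in a minimal, hypothetical setting (this is a sketch, not the paper's dynamic-system experiments): minimum-norm least-squares regression on random ReLU features, where the number of features `p` plays the role of model complexity. All function names, widths, and the `sin` target below are illustrative assumptions; the second descent typically appears once `p` exceeds the number of training points.

```python
import numpy as np

def double_descent_curve(n_train=20, n_test=200,
                         widths=(2, 5, 10, 15, 20, 40, 100, 400),
                         n_trials=20, noise=0.1, seed=0):
    """Average test MSE of min-norm least squares on random ReLU
    features, as a function of the number of features (model width).

    Illustrative setup: the true map is sin(x); the model is linear
    regression on p random ReLU features, solved with the pseudoinverse,
    which gives the minimum-norm solution when p > n_train.
    """
    rng = np.random.default_rng(seed)
    errs = np.zeros(len(widths))
    for _ in range(n_trials):
        x_tr = rng.uniform(-np.pi, np.pi, size=(n_train, 1))
        x_te = rng.uniform(-np.pi, np.pi, size=(n_test, 1))
        y_tr = np.sin(x_tr[:, 0]) + noise * rng.standard_normal(n_train)
        y_te = np.sin(x_te[:, 0])
        for i, p in enumerate(widths):
            # Random first layer, fixed; only the output weights are fit.
            W = rng.standard_normal((1, p))
            b = rng.standard_normal(p)
            phi_tr = np.maximum(x_tr @ W + b, 0.0)
            phi_te = np.maximum(x_te @ W + b, 0.0)
            theta = np.linalg.pinv(phi_tr) @ y_tr  # min-norm least squares
            errs[i] += np.mean((phi_te @ theta - y_te) ** 2) / n_trials
    return dict(zip(widths, errs))
```

In this kind of experiment the test error typically peaks near the interpolation threshold (`p` close to `n_train`) and then decreases again as the model becomes heavily overparameterized, mirroring the curve the paper investigates for dynamic-system models.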






