Sparsity in Continuous-Depth Neural Networks

10/26/2022
by Hananeh Aliee, et al.

Neural Ordinary Differential Equations (NODEs) have proven successful at learning dynamical systems, accurately recovering observed trajectories. While different types of sparsity have been proposed to improve robustness, the generalization properties of NODEs beyond the observed data remain underexplored. We systematically study the influence of weight and feature sparsity on forecasting as well as on identifying the underlying dynamical laws. Besides assessing existing methods, we propose a regularization technique that sparsifies "input-output connections" and extracts relevant features during training. Moreover, we curate real-world datasets, consisting of human motion capture and human hematopoiesis single-cell RNA-seq data, to realistically analyze different levels of out-of-distribution (OOD) generalization in forecasting and in dynamics identification, respectively. Our extensive empirical evaluation on these challenging benchmarks suggests that weight sparsity improves generalization in the presence of noise or irregular sampling. However, it does not prevent learning spurious feature dependencies in the inferred dynamics, rendering the resulting models impractical for predictions under interventions or for inferring the true underlying dynamics. Feature sparsity, by contrast, does help recover sparse ground-truth dynamics compared to unregularized NODEs.
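The feature-sparsity idea described above can be illustrated with a minimal sketch. This is not the paper's implementation: the network sizes, the explicit Euler integrator, and names such as `input_group_penalty` are illustrative assumptions. The key point is that a group-lasso penalty on the input columns of the vector field's first layer encourages entire input features to drop out of the learned dynamics, rather than individual weights.

```python
import numpy as np

rng = np.random.default_rng(0)
d, h = 3, 8  # state dimension and hidden width (illustrative sizes)
W1 = 0.1 * rng.normal(size=(h, d))  # input layer of the vector field
W2 = 0.1 * rng.normal(size=(d, h))  # output layer of the vector field

def vector_field(x):
    # NODE dynamics: dx/dt = W2 tanh(W1 x)
    return W2 @ np.tanh(W1 @ x)

def euler_rollout(x0, dt=0.01, steps=100):
    # Explicit Euler integration of the learned dynamics
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] + dt * vector_field(xs[-1]))
    return np.stack(xs)

def input_group_penalty(W):
    # Group lasso over input columns of W1: the j-th group is the set of
    # weights reading feature j, so the penalty zeroes whole features
    return float(np.sum(np.linalg.norm(W, axis=0)))

traj = euler_rollout(np.ones(d))
penalty = input_group_penalty(W1)
# A training loss would combine trajectory fit and sparsity, e.g.
# loss = mse(traj, observed) + lam * input_group_penalty(W1)
```

In practice the penalty is added to the trajectory-matching loss and minimized jointly; columns of `W1` whose norm is driven to zero correspond to input features that the inferred dynamics do not depend on.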


Related research

02/22/2021: Neural Delay Differential Equations
Neural Ordinary Differential Equations (NODEs), a framework of continuou...

11/11/2022: SPADE4: Sparsity and Delay Embedding based Forecasting of Epidemics
Predicting the evolution of diseases is challenging, especially when the...

06/16/2023: Stabilized Neural Differential Equations for Learning Constrained Dynamics
Many successful methods to learn dynamical systems from data have recent...

06/26/2018: Tangent-Space Regularization for Neural-Network Models of Dynamical Systems
This work introduces the concept of tangent space regularization for neu...

07/22/2020: Using local dynamics to explain analog forecasting of chaotic systems
Analogs are nearest neighbors of the state of a system. By using analogs...

03/25/2022: Simultaneous Identification and Denoising of Dynamical Systems
In recent years there has been a push to discover the governing equation...

07/12/2023: Learning Stochastic Dynamical Systems as an Implicit Regularization with Graph Neural Networks
Stochastic Gumbel graph networks are proposed to learn high-dimensional ...
