Finite-time Lyapunov exponents of deep neural networks

06/21/2023
by L. Storm, et al.

We compute how small input perturbations affect the output of deep neural networks, exploring an analogy between deep networks and dynamical systems, where the growth or decay of local perturbations is characterised by finite-time Lyapunov exponents. We show that the maximal exponent forms geometrical structures in input space, akin to coherent structures in dynamical systems. Ridges of large positive exponents divide input space into different regions that the network associates with different classes. These ridges visualise the geometry that deep networks construct in input space, shedding light on the fundamental mechanisms underlying their learning capabilities.
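To make the quantity concrete, here is a minimal sketch of how such a maximal finite-time Lyapunov exponent could be computed, assuming it is taken as (1/L) times the log of the largest singular value of the network's input-output Jacobian, with the number of layers L playing the role of time in the dynamical-systems analogy. The network `mlp` and the helper `max_ftle` are illustrative names, not code from the paper.

```python
import jax
import jax.numpy as jnp

def mlp(params, x):
    """Plain tanh multilayer perceptron with a linear output layer."""
    for W, b in params[:-1]:
        x = jnp.tanh(W @ x + b)
    W, b = params[-1]
    return W @ x + b

def max_ftle(params, x):
    """Maximal finite-time Lyapunov exponent at input x:
    (1/L) * log of the largest singular value of d(output)/d(input)."""
    L = len(params)  # depth plays the role of "time"
    J = jax.jacobian(lambda z: mlp(params, z))(x)
    sigma_max = jnp.linalg.svd(J, compute_uv=False)[0]
    return jnp.log(sigma_max) / L

# Example: random untrained network, one two-dimensional input point.
key = jax.random.PRNGKey(0)
sizes = [2, 32, 32, 1]
params = []
for m, n in zip(sizes[:-1], sizes[1:]):
    key, subkey = jax.random.split(key)
    params.append((jax.random.normal(subkey, (n, m)) / jnp.sqrt(m),
                   jnp.zeros(n)))
x = jnp.array([0.3, -0.7])
print(max_ftle(params, x))
```

Evaluating `max_ftle` over a grid of two-dimensional inputs and plotting the result is one way to visualise the ridges of large positive exponents described above.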

Related research

10/05/2016
Nonlinear Systems Identification Using Deep Dynamic Neural Networks
Neural networks are known to be effective function approximators. Recent...

09/22/2021
Analysis of chaotic dynamical systems with autoencoders
We focus on chaotic dynamical systems and analyze their time series with...

12/24/2022
Within-Cluster Variability Exponent for Identifying Coherent Structures in Dynamical Systems
We propose a clustering-based approach for identifying coherent flow str...

11/01/2021
Deep neural networks as nested dynamical systems
There is an analogy that is often made between deep neural networks and ...

03/22/2022
On Robust Classification using Contractive Hamiltonian Neural ODEs
Deep neural networks can be fragile and sensitive to small input perturb...

07/06/2021
Dynamical System Parameter Identification using Deep Recurrent Cell Networks
In this paper, we investigate the parameter identification problem in dy...
