
Path differentiability of ODE flows

by Swann Marx, et al.

We consider flows of ordinary differential equations (ODEs) driven by path differentiable vector fields. Path differentiable functions constitute a proper subclass of Lipschitz functions that admit conservative gradients, a notion of generalized derivative compatible with basic calculus rules. Our main result states that such flows inherit the path differentiability of the driving vector field. Indeed, we show that forward propagation of derivatives, given by the sensitivity differential inclusions, provides a conservative Jacobian for the flow. This allows us to propose a nonsmooth version of the adjoint method, applicable to integral costs under an ODE constraint. These results provide a theoretical grounding for the application of small-step first-order methods to a broad class of nonsmooth optimization problems with parametrized ODE constraints, as illustrated by the convergence of small-step first-order methods based on the proposed nonsmooth adjoint.
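To make the forward-propagation idea concrete, here is a minimal numerical sketch (not the paper's construction) of sensitivity propagation for a parametrized ODE with a nonsmooth right-hand side. The vector field f(x, θ) = −relu(x) + θ is a hypothetical example chosen because relu is path differentiable; at the kink we pick one admissible selection of its conservative gradient and integrate the sensitivity equation s' = (∂f/∂x) s + ∂f/∂θ alongside the flow with an explicit Euler scheme:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def drelu(x):
    # One admissible selection from the conservative gradient of relu
    # (the value at the kink x = 0 is taken to be 0 here).
    return float(x > 0.0)

def flow_with_sensitivity(x0, theta, T=1.0, n=1000):
    """Euler discretization of x' = -relu(x) + theta together with the
    forward sensitivity s' = (df/dx) s + df/dtheta, s(0) = 0.
    Returns the approximate flow x(T) and sensitivity dx(T)/dtheta."""
    h = T / n
    x, s = x0, 0.0
    for _ in range(n):
        fx = -relu(x) + theta              # vector field at current state
        dfdx = -drelu(x)                   # selected generalized derivative
        dfdtheta = 1.0
        s = s + h * (dfdx * s + dfdtheta)  # sensitivity update (uses current x)
        x = x + h * fx                     # flow update
    return x, s

x0, theta = 1.0, 0.5
xT, dxT_dtheta = flow_with_sensitivity(x0, theta)

# Cross-check the propagated sensitivity against a finite difference in theta.
eps = 1e-6
xT_plus, _ = flow_with_sensitivity(x0, theta + eps)
fd = (xT_plus - xT) / eps
```

For this initial condition the trajectory stays in the smooth region x > 0, where the exact sensitivity at T = 1 is 1 − e⁻¹ ≈ 0.632, so the propagated value and the finite difference agree closely; the same scheme still applies when the trajectory crosses the kink, which is where the conservative-Jacobian viewpoint matters.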



Conservative set valued fields, automatic differentiation, stochastic gradient method and deep learning

The Clarke subdifferential is not suited to tackle nonsmooth deep learni...

Discretization by Euler's method for regular Lagrangian flow

This paper is concerned with the numerical analysis of the explicit Eule...

Second-order flows for computing the ground states of rotating Bose-Einstein condensates

Second-order flows in this paper refer to some artificial evolutionary d...

A Technical Note: Two-Step PECE Methods for Approximating Solutions To First- and Second-Order ODEs

Two-step predictor/corrector methods are provided to solve three classes...

A Forward Propagation Algorithm for Online Optimization of Nonlinear Stochastic Differential Equations

Optimizing over the stationary distribution of stochastic differential e...

The structure of conservative gradient fields

The classical Clarke subdifferential alone is inadequate for understandi...

Gradients should stay on Path: Better Estimators of the Reverse- and Forward KL Divergence for Normalizing Flows

We propose an algorithm to estimate the path-gradient of both the revers...