The structure of conservative gradient fields

01/03/2021
by Adrian Lewis et al.

The classical Clarke subdifferential alone is inadequate for understanding automatic differentiation in nonsmooth contexts. Instead, we can sometimes rely on enlarged generalized gradients called "conservative fields", defined through the natural path-wise chain rule; one application is the convergence analysis of gradient-based deep learning algorithms. In the semi-algebraic case, we show that all conservative fields are in fact just Clarke subdifferentials plus normals of manifolds in underlying Whitney stratifications.
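To make the definition concrete: following the path-wise chain-rule characterization in the Bolte and Pauwels paper listed under related research below, a set-valued map D : R^n ⇉ R^n with closed graph and nonempty compact values is a conservative field for a locally Lipschitz function f when, for every absolutely continuous curve γ : [0,1] → R^n,

    (d/dt) f(γ(t)) = ⟨v, γ'(t)⟩   for all v ∈ D(γ(t)), for almost every t ∈ [0,1].

Such a field may be strictly larger than the Clarke subdifferential, and that slack is precisely what automatic differentiation needs: AD differentiates a particular program for f, and equivalent programs can return different values at nonsmooth points, some of them outside the Clarke subdifferential. A minimal sketch in JAX (the ReLU variants are our own illustrations, not examples from the paper, and the printed values depend on the library version):

    import jax
    import jax.numpy as jnp

    # Three programs computing the same function, relu(x) = max(x, 0).
    relu_max   = lambda x: jnp.maximum(x, 0.0)
    relu_where = lambda x: jnp.where(x > 0, x, 0.0)
    relu_half  = lambda x: 0.5 * (x + jnp.abs(x))

    # Away from 0 all three gradients agree; at the nonsmooth point x = 0
    # the formal chain rule is ambiguous (recent JAX versions report 0.5,
    # 0.0 and 0.5 respectively; other frameworks make other choices).
    for f in (relu_max, relu_where, relu_half):
        print(jax.grad(f)(0.0))

    # Sharper still: relu(x) - relu(-x) is the identity map, whose only
    # Clarke subgradient at 0 is 1, yet AD applied to this program returns
    # 0.0 there. A conservative field for the program contains both values.
    identity = lambda x: relu_where(x) - relu_where(-x)
    print(jax.grad(identity)(0.0))  # 0.0, although the true derivative is 1

The structure theorem stated above describes what such enlargements can look like in the semi-algebraic case: at each point, a conservative field adds to the Clarke subdifferential only directions normal to the stratum of an underlying Whitney stratification.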

Related research

- 09/23/2019: Conservative set valued fields, automatic differentiation, stochastic gradient method and deep learning
  The Clarke subdifferential is not suited to tackle nonsmooth deep learni...
- 05/31/2022: Automatic differentiation of nonsmooth iterative algorithms
  Differentiation along algorithms, i.e., piggyback propagation of derivat...
- 02/11/2022: Conservative Extensions for Existential Rules
  We study the problem to decide, given sets T1,T2 of tuple-generating dep...
- 09/08/2021: Iterated Vector Fields and Conservatism, with Applications to Federated Learning
  We study iterated vector fields and investigate whether they are conserv...
- 07/09/2021: The Bayesian Learning Rule
  We show that many machine-learning algorithms are specific instances of ...
- 01/11/2022: Path differentiability of ODE flows
  We consider flows of ordinary differential equations (ODEs) driven by pa...
- 08/17/2023: Probabilistic Gradient-Based Extrema Tracking
  Feature tracking is a common task in visualization applications, where m...
