Nonsmooth Implicit Differentiation for Machine Learning and Optimization

06/08/2021
by Jérôme Bolte, et al.

In view of training increasingly complex learning architectures, we establish a nonsmooth implicit function theorem with an operational calculus. Our result applies to most practical problems (i.e., definable problems) provided that a nonsmooth form of the classical invertibility condition is fulfilled. This approach allows for formal subdifferentiation: for instance, replacing derivatives by Clarke Jacobians in the usual differentiation formulas is fully justified for a wide class of nonsmooth problems. Moreover, this calculus is entirely compatible with algorithmic differentiation (e.g., backpropagation). We provide several applications, such as training deep equilibrium networks, training neural nets with conic optimization layers, and hyperparameter tuning for nonsmooth Lasso-type models. To show the sharpness of our assumptions, we present numerical experiments showcasing the extremely pathological gradient dynamics one can encounter when applying implicit algorithmic differentiation without any hypothesis.
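To illustrate the kind of formula the theorem justifies, the following is a minimal smooth sketch (not the paper's method, and with a hypothetical fixed-point equation chosen for illustration): a solution x(θ) of F(x, θ) = 0 is differentiated via the implicit function theorem, dx/dθ = -(∂F/∂x)⁻¹ ∂F/∂θ, and checked against finite differences. In the nonsmooth setting studied in the paper, the derivative ∂F/∂x would be replaced by an invertible element of the Clarke Jacobian.

```python
import numpy as np

# Hypothetical scalar example: F(x, theta) = x - tanh(theta * x + 0.5) = 0
# implicitly defines x(theta). The implicit function theorem gives
#   dx/dtheta = -(dF/dx)^(-1) * dF/dtheta,
# the smooth prototype of the nonsmooth calculus described in the abstract.

def solve_fixed_point(theta, x0=0.0, iters=200):
    """Solve x = tanh(theta * x + 0.5) by fixed-point iteration."""
    x = x0
    for _ in range(iters):
        x = np.tanh(theta * x + 0.5)
    return x

def implicit_grad(theta):
    """Differentiate the solution map theta -> x(theta) implicitly."""
    x = solve_fixed_point(theta)
    s = 1.0 - np.tanh(theta * x + 0.5) ** 2  # tanh' at the solution
    dF_dx = 1.0 - s * theta                  # must be nonzero (invertibility condition)
    dF_dtheta = -s * x
    return -dF_dtheta / dF_dx

theta = 0.3
g = implicit_grad(theta)

# Finite-difference sanity check of the implicit gradient.
eps = 1e-6
fd = (solve_fixed_point(theta + eps) - solve_fixed_point(theta - eps)) / (2 * eps)
print(abs(g - fd))
```

The check passes only because dF/dx stays bounded away from zero here; the paper's numerical experiments show how badly gradient dynamics can behave when such an invertibility hypothesis is dropped.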


research
05/31/2021

Efficient and Modular Implicit Differentiation

Automatic differentiation (autodiff) has revolutionized machine learning...
research
11/09/2022

Approximate backwards differentiation of gradient flow

The gradient flow (GF) is an ODE for which its explicit Euler's discreti...
research
05/06/2022

Beyond backpropagation: implicit gradients for bilevel optimization

This paper reviews gradient-based techniques to solve bilevel optimizati...
research
12/15/2022

Differentiating Nonsmooth Solutions to Parametric Monotone Inclusion Problems

We leverage path differentiability and a recent result on nonsmooth impl...
research
02/20/2020

Implicit differentiation of Lasso-type models for hyperparameter optimization

Setting regularization parameters for Lasso-type estimators is notorious...
research
01/31/2022

Differentiating and Integrating ZX Diagrams

ZX-calculus has proved to be a useful tool for quantum technology with a...
research
07/02/2022

Object Representations as Fixed Points: Training Iterative Refinement Algorithms with Implicit Differentiation

Iterative refinement – start with a random guess, then iteratively impro...
