You Only Linearize Once: Tangents Transpose to Gradients

04/22/2022
by Alexey Radul, et al.

Automatic differentiation (AD) is conventionally understood as a family of distinct algorithms, rooted in two "modes" – forward and reverse – which are typically presented (and implemented) separately. Can there be only one? Following up on the AD systems developed in the JAX and Dex projects, we formalize a decomposition of reverse-mode AD into (i) forward-mode AD followed by (ii) unzipping the linear and non-linear parts and then (iii) transposition of the linear part. To that end, we use the technology of linear types to formalize a notion of structurally linear functions, which are then also algebraically linear. Our main results are that forward-mode AD produces structurally linear functions, and that we can unzip and transpose any structurally linear function, conserving cost, size, and structural linearity. Composing these three transformations recovers reverse-mode AD. This decomposition also sheds light on checkpointing, which emerges naturally from a free choice in unzipping let expressions. As a corollary, checkpointing techniques are applicable to general-purpose partial evaluation, not just AD. We hope that our formalization will lead to a deeper understanding of automatic differentiation and that it will simplify implementations, by separating the concerns of differentiation proper from the concerns of gaining efficiency (namely, separating the derivative computation from the act of running it backward).
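The same three-step pipeline is visible in JAX's public API: `jax.linearize` performs forward-mode linearization (roughly covering steps (i) and (ii), since it partially evaluates away the non-linear part), and `jax.linear_transpose` transposes the resulting linear map (step (iii)). The snippet below is a minimal illustrative sketch, not the paper's formalism, showing that composing the two reproduces the reverse-mode gradient:

```python
import jax
import jax.numpy as jnp

def f(x):
    return jnp.sin(x) * x

x = jnp.float32(2.0)

# (i) + (ii): forward-mode AD plus partial evaluation yields the primal
# output and a map that is linear in the tangent argument.
y, f_jvp = jax.linearize(f, x)

# (iii): transpose the linear tangent map; tangents transpose to gradients.
f_vjp = jax.linear_transpose(f_jvp, x)

# Composing the steps recovers reverse mode:
# d/dx [sin(x) * x] = x * cos(x) + sin(x).
(grad_via_transpose,) = f_vjp(jnp.float32(1.0))
assert jnp.allclose(grad_via_transpose, jax.grad(f)(x))
```

Here `jax.grad` is the conventional reverse-mode entry point; the point of the sketch is only that linearize-then-transpose computes the same gradient.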


