Forward- or Reverse-Mode Automatic Differentiation: What's the Difference?

12/21/2022
by Birthe van den Berg, et al.

Automatic differentiation (AD) has been a topic of interest for researchers in many disciplines, with increased popularity since its application to machine learning and neural networks. Although many researchers appreciate and know how to apply AD, it remains a challenge to truly understand the underlying processes. From an algebraic point of view, however, AD appears surprisingly natural: it originates from the differentiation laws. In this work we use Algebra of Programming techniques to reason about different AD variants, leveraging Haskell to illustrate our observations. Our findings stem from three fundamental algebraic abstractions: (1) the notion of a module over a semiring, (2) Nagata's construction of the 'idealization of a module', and (3) Kronecker's delta function, which together allow us to write a single-line abstract definition of AD. From this single-line definition, and by instantiating our algebraic structures in various ways, we derive different AD variants that have the same extensional behaviour but different intensional properties, mainly in terms of (asymptotic) computational complexity. We show that the different variants are equivalent by means of Kronecker isomorphisms, a further elaboration of our Haskell infrastructure that guarantees correctness by construction. With this framework in place, this paper seeks to make AD variants more comprehensible, taking an algebraic perspective on the matter.
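
The three ingredients named in the abstract can be illustrated with a small, self-contained Haskell sketch of forward-mode AD. This is only an illustration under stated assumptions, not the paper's actual code: the names Grad, addG, scaleG, delta, Nagata, and var are chosen here for exposition, and the module over the semiring is specialised to sparse gradient maps over Double.

    import qualified Data.Map as M

    type Var = String

    -- A module over the semiring Double: a sparse gradient vector,
    -- mapping each variable to its coefficient.
    type Grad = M.Map Var Double

    zeroG :: Grad
    zeroG = M.empty

    addG :: Grad -> Grad -> Grad
    addG = M.unionWith (+)

    scaleG :: Double -> Grad -> Grad
    scaleG s = M.map (s *)

    -- Kronecker delta: a variable's derivative is 1 with respect to
    -- itself and 0 with respect to every other variable.
    delta :: Var -> Grad
    delta x = M.singleton x 1

    -- Nagata's idealization of a module: a primal value paired with a
    -- tangent drawn from the module (a generalised dual number).
    data Nagata = N { primal :: Double, tangent :: Grad }
      deriving Show

    instance Num Nagata where
      N f df + N g dg = N (f + g) (addG df dg)
      N f df * N g dg = N (f * g) (addG (scaleG g df) (scaleG f dg))  -- product rule
      negate (N f df) = N (negate f) (scaleG (-1) df)
      fromInteger n   = N (fromInteger n) zeroG
      abs    = error "abs: not needed for this sketch"
      signum = error "signum: not needed for this sketch"

    -- Introduce a variable, seeding its derivative with the Kronecker delta.
    var :: Var -> Double -> Nagata
    var x v = N v (delta x)

    -- Example: f (x, y) = x * x * y + y + 3 at (x, y) = (2, 3).
    main :: IO ()
    main = do
      let x = var "x" 2
          y = var "y" 3
          N v g = x * x * y + y + 3
      print v              -- 18.0
      print (M.toList g)   -- [("x",12.0),("y",5.0)]

Overloading the numeric operations on Nagata pairs makes differentiation fall out of the differentiation laws themselves, as the abstract suggests. Choosing a different instantiation of the module (for instance, a representation that is accumulated from the outputs backwards) is, in the abstract's terms, what yields other variants such as reverse mode, with the same extensional behaviour but different asymptotic complexity.
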


