A Differential-form Pullback Programming Language for Higher-order Reverse-mode Automatic Differentiation

02/19/2020
by Carol Mak, et al.

Building on the observation that reverse-mode automatic differentiation (AD) – a generalisation of backpropagation – can naturally be expressed as pullbacks of differential 1-forms, we design a simple higher-order programming language with a first-class differential operator, and present a reduction strategy which exactly simulates reverse-mode AD. We justify our reduction strategy by interpreting our language in any differential λ-category that satisfies the Hahn-Banach Separation Theorem, and show that the reduction strategy precisely captures reverse-mode AD in a truly higher-order setting.
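The paper's starting observation is that reverse-mode AD computes, for each primitive operation, the transpose-Jacobian action on an output cotangent — i.e. the pullback of a differential 1-form. As a rough illustration of that idea only (this is a generic reverse-mode AD sketch in Python, not the paper's language or reduction strategy; the names `Var` and `backward` are invented for this example):

```python
# Illustrative sketch: reverse-mode AD as pulling a cotangent (a 1-form
# evaluated at the output) back through each operation via local partials.
# Names (Var, backward) are hypothetical, not from the paper.

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # list of (parent Var, local partial derivative)
        self.grad = 0.0         # accumulated pulled-back cotangent

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

def backward(output):
    """Seed the output cotangent with 1.0 and pull it back to all inputs."""
    # Topological order via depth-first search, so each node's cotangent
    # is complete before it is propagated to its parents.
    order, seen = [], set()
    def visit(v):
        if id(v) not in seen:
            seen.add(id(v))
            for parent, _ in v.parents:
                visit(parent)
            order.append(v)
    visit(output)

    output.grad = 1.0
    for node in reversed(order):
        for parent, partial in node.parents:
            # Pullback step: apply the local Jacobian transpose.
            parent.grad += partial * node.grad

x, y = Var(3.0), Var(4.0)
z = x * y + x            # z = xy + x
backward(z)
print(x.grad, y.grad)    # dz/dx = y + 1 = 5.0, dz/dy = x = 3.0
```

The paper's contribution is to make this pullback structure first-class in a higher-order language and to prove the corresponding reduction strategy correct in any differential λ-category satisfying the Hahn-Banach Separation Theorem; the sketch above only shows the first-order intuition being generalised.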


