A Differential-form Pullback Programming Language for Higher-order Reverse-mode Automatic Differentiation

02/19/2020
by Carol Mak et al.

Building on the observation that reverse-mode automatic differentiation (AD) – a generalisation of backpropagation – can naturally be expressed as pullbacks of differential 1-forms, we design a simple higher-order programming language with a first-class differential operator, and present a reduction strategy which exactly simulates reverse-mode AD. We justify our reduction strategy by interpreting our language in any differential λ-category that satisfies the Hahn-Banach Separation Theorem, and show that the reduction strategy precisely captures reverse-mode AD in a truly higher-order setting.
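To make the pullback view concrete, the following Haskell sketch shows a standard way of phrasing first-order reverse-mode AD as pulling back a cotangent (the coefficient of a differential 1-form on the output) to cotangents on the named inputs. This is an illustrative assumption-laden sketch, not the paper's language or reduction strategy: the types D and Cotangent, and the names var, grad, addD, mulD, sinD are all hypothetical.

```haskell
-- Minimal sketch: a value paired with a pullback that maps an output
-- cotangent to cotangents on the named inputs (reverse-mode AD).
-- Hypothetical names; not the calculus defined in the paper.
import qualified Data.Map.Strict as M

type Cotangent = M.Map String Double

data D = D { primal :: Double, pullback :: Double -> Cotangent }

-- A named variable pulls an output cotangent back onto itself.
var :: String -> Double -> D
var name x = D x (\dz -> M.singleton name dz)

-- A constant absorbs any cotangent: nothing to pull back.
constD :: Double -> D
constD c = D c (const M.empty)

addD, mulD :: D -> D -> D
addD (D x px) (D y py) =
  D (x + y) (\dz -> M.unionWith (+) (px dz) (py dz))
mulD (D x px) (D y py) =
  D (x * y) (\dz -> M.unionWith (+) (px (dz * y)) (py (dz * x)))

sinD :: D -> D
sinD (D x px) = D (sin x) (\dz -> px (dz * cos x))

-- Gradient of a scalar function: pull back the 1-form "d(output) = 1".
grad :: (M.Map String D -> D) -> M.Map String Double -> Cotangent
grad f env = pullback (f (M.mapWithKey var env)) 1

main :: IO ()
main = do
  -- f(x, y) = x * sin y + x
  let f env = (env M.! "x") `mulD` sinD (env M.! "y") `addD` (env M.! "x")
  -- Expected: x maps to sin 0.5 + 1, y maps to 2 * cos 0.5
  print (grad f (M.fromList [("x", 2.0), ("y", 0.5)]))
```

In this reading, grad computes the pullback of the output 1-form along the function; the abstract's contribution is to internalise such pullbacks as a first-class differential operator in a higher-order language and to prove the resulting reduction strategy simulates reverse-mode AD.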


Related research

05/20/2021 · Decomposing reverse-mode automatic differentiation
We decompose reverse-mode automatic differentiation into (forward-mode) ...

11/10/2016 · Efficient Implementation of a Higher-Order Language with Built-In AD
We show that Automatic Differentiation (AD) operators can be provided in...

03/27/2018 · Demystifying Differentiable Programming: Shift/Reset the Penultimate Backpropagator
Deep learning has seen tremendous success over the past decade in comput...

12/14/2021 · Verifying a Minimalist Reverse-Mode AD Library
By exploiting a number of relatively subtle programming language feature...
04/02/2018 · The simple essence of automatic differentiation (Differentiable functional programming made easy)
Automatic differentiation (AD) in reverse mode (RAD) is a central compon...

05/23/2022 · Dual-Numbers Reverse AD, Efficiently
Where dual-numbers forward-mode automatic differentiation (AD) pairs eac...