Demystifying Differentiable Programming: Shift/Reset the Penultimate Backpropagator

03/27/2018
by Fei Wang et al.

Deep learning has seen tremendous success over the past decade in computer vision, machine translation, and gameplay. This success rests in crucial ways on gradient-descent optimization and the ability to learn parameters of a neural network by backpropagating observed errors. However, neural network architectures are growing increasingly sophisticated and diverse, which motivates an emerging quest for even more general forms of differentiable programming, where arbitrary parameterized computations can be trained by gradient descent. In this paper, we take a fresh look at automatic differentiation (AD) techniques, and especially aim to demystify the reverse-mode form of AD that generalizes backpropagation in neural networks. We uncover a tight connection between reverse-mode AD and delimited continuations, which permits implementing reverse-mode AD purely via operator overloading and without any auxiliary data structures. We further show how this formulation of AD can be fruitfully combined with multi-stage programming (staging), leading to a highly efficient implementation that combines the performance benefits of deep learning frameworks based on explicit reified computation graphs (e.g., TensorFlow) with the expressiveness of pure library approaches (e.g., PyTorch).
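To make the connection between reverse-mode AD and continuations concrete, the following is a minimal sketch in Scala. It is not the paper's implementation (which uses shift/reset delimited-continuation operators and multi-stage programming); it only illustrates how operator overloading in continuation-passing style runs the backward pass as the continuations return, without building an explicit tape or computation graph. The names NumR, grad, and their signatures are illustrative assumptions, not the paper's API.

```scala
// Sketch: reverse-mode AD via operator overloading in explicit
// continuation-passing style. The paper uses shift/reset to hide the
// continuations; here they are passed by hand. NumR and grad are
// hypothetical names used for illustration only.
class NumR(val x: Double, var d: Double = 0.0) {
  // Each operator builds its result, runs the continuation k (the rest of
  // the forward pass), then propagates adjoints backward as it returns.
  def +(that: NumR)(k: NumR => Unit): Unit = {
    val y = new NumR(x + that.x)
    k(y)
    this.d += y.d            // d(this + that)/d(this) = 1
    that.d += y.d            // d(this + that)/d(that) = 1
  }
  def *(that: NumR)(k: NumR => Unit): Unit = {
    val y = new NumR(x * that.x)
    k(y)
    this.d += that.x * y.d   // product rule
    that.d += this.x * y.d
  }
}

object Demo {
  // Seed the output adjoint with 1.0, then read the input adjoint back.
  def grad(f: NumR => (NumR => Unit) => Unit)(x: Double): Double = {
    val in = new NumR(x)
    f(in)(out => out.d = 1.0)
    in.d
  }

  def main(args: Array[String]): Unit = {
    // f(x) = x*x + x, so f'(x) = 2x + 1; at x = 3.0 this prints 7.0.
    val df = grad(x => k => (x * x)(y => (y + x)(k)))(3.0)
    println(df)
  }
}
```

Because the adjoint updates execute as each overloaded call returns, the backward pass is interleaved with the unwinding of the forward pass's call stack; this is the behavior that shift/reset makes implicit in the paper's formulation, with no auxiliary tape or reified graph required.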
