Forward-Mode Automatic Differentiation in Julia

07/26/2016
by Jarrett Revels et al.

We present ForwardDiff, a Julia package for forward-mode automatic differentiation (AD) featuring performance competitive with low-level languages like C++. Unlike recently developed AD tools in other popular high-level languages such as Python and MATLAB, ForwardDiff takes advantage of just-in-time (JIT) compilation to transparently recompile AD-unaware user code, enabling efficient support for higher-order differentiation and differentiation using custom number types (including complex numbers). For gradient and Jacobian calculations, ForwardDiff provides a variant of vector forward mode that avoids expensive heap allocation and makes better use of memory bandwidth than traditional vector mode. In our numerical experiments, we demonstrate that for nontrivially large dimensions, ForwardDiff's gradient computations can be faster than a reverse-mode implementation from the Python-based autograd package. We also illustrate how ForwardDiff is used effectively within JuMP, a modeling language for optimization. According to our usage statistics, 41 unique repositories on GitHub depend on ForwardDiff, with users from diverse fields such as astronomy, optimization, finite element analysis, and statistics. This document is an extended abstract that has been accepted for presentation at AD2016, the 7th International Conference on Algorithmic Differentiation.
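For readers unfamiliar with the package, a minimal sketch of the user-facing API described above follows. It assumes ForwardDiff is installed; the target functions `f` and `h` are illustrative placeholders, not examples taken from the paper.

```julia
using ForwardDiff

# An ordinary, AD-unaware Julia function (a hypothetical example).
f(x) = sum(sin, x) + prod(tanh, x)

x = rand(5)

# Gradient of f at x; internally, ForwardDiff re-runs f on its dual
# number type, which Julia's JIT compiler specializes automatically.
g = ForwardDiff.gradient(f, x)

# Derivative of a scalar-to-scalar function at a point.
d = ForwardDiff.derivative(sin, 1.0)   # equals cos(1.0)

# Jacobian of a vector-valued function (another hypothetical example).
h(y) = [y[1]^2 + y[2], y[2] * y[3]]
J = ForwardDiff.jacobian(h, rand(3))
```

Note that `f` and `h` contain no AD-specific annotations: the JIT recompilation the abstract describes is what lets the same source code run unmodified on ForwardDiff's dual numbers.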

