AbstractDifferentiation.jl: Backend-Agnostic Differentiable Programming in Julia

09/25/2021
by Frank Schäfer, et al.

No single Automatic Differentiation (AD) system is the optimal choice for all problems, so an informed, problem-specific selection and combination of AD systems can greatly impact performance. In the Julia programming language, the major AD systems operate on the same input and can thus, in theory, compose. Until now, however, switching between AD packages required end-users to familiarize themselves with the user-facing API of each respective package, and implementing a new, usable AD package required its developers to write boilerplate code defining convenience API functions for end-users. In response to these issues, we present AbstractDifferentiation.jl, which automatically generates an extensive, unified, user-facing API for any AD package. By splitting the complexity between AD users and AD developers, package developers need to implement only one or two primitives, such as pullbacks or pushforwards, to support a range of user-facing utilities such as Jacobians, Hessians, and lazy product operators. This removes tedious, but so far unavoidable, boilerplate code and enables end-users to easily switch between and compose AD implementations.
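For illustration, the following is a minimal sketch of what switching backends through such a unified API could look like. The specific names (AD.ForwardDiffBackend, AD.ZygoteBackend, AD.gradient, AD.hessian, AD.pullback_function, AD.pushforward_function) follow the package's documented interface but do not appear in this abstract, so treat them as assumptions rather than a definitive listing.

    # Sketch based on the package's documented usage; names are assumed, not quoted from this abstract.
    import AbstractDifferentiation as AD
    import ForwardDiff, Zygote

    f(x) = log(sum(exp, x))   # scalar-valued test function
    x = rand(10)

    # The same user-facing calls work for any registered backend:
    forward = AD.ForwardDiffBackend()   # forward-mode AD
    reverse = AD.ZygoteBackend()        # reverse-mode AD

    AD.gradient(reverse, f, x)          # gradient computed via Zygote's pullback
    AD.hessian(forward, f, x)           # Hessian computed via ForwardDiff
    AD.value_and_gradient(reverse, f, x)

    # Lazy operators built from each backend's native primitive:
    vjp = AD.pullback_function(reverse, f, x)     # vector-Jacobian product: v -> v' * J
    jvp = AD.pushforward_function(forward, f, x)  # Jacobian-vector product: v -> J * v

Swapping forward for reverse (or any other supported backend) in these calls is the only change an end-user would make to move between AD implementations; the package developer, in turn, only supplies the pullback or pushforward primitive.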
