Towards Denotational Semantics of AD for Higher-Order, Recursive, Probabilistic Languages

11/30/2021
by Alexander K. Lew et al.

Automatic differentiation (AD) aims to compute derivatives of user-defined functions, but in Turing-complete languages, this simple specification does not fully capture AD's behavior: AD sometimes disagrees with the true derivative of a differentiable program, and when AD is applied to non-differentiable or effectful programs, it is unclear what guarantees (if any) hold of the resulting code. We study an expressive differentiable programming language, with piecewise-analytic primitives, higher-order functions, and general recursion. Our main result is that even in this general setting, a version of Lee et al. [2020]'s correctness theorem (originally proven for a first-order language without partiality or recursion) holds: all programs denote so-called ωPAP functions, and AD computes correct intensional derivatives of them. Mazza and Pagani [2021]'s recent theorem, that AD disagrees with the true derivative of a differentiable recursive program at a measure-zero set of inputs, can be derived as a straightforward corollary of this fact. We also apply the framework to study probabilistic programs, and recover a recent result from Mak et al. [2021] via a novel denotational argument.
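The phenomenon described above, that AD can disagree with the true derivative of a differentiable program, but only on a measure-zero set of inputs, can be illustrated with a small self-contained sketch. The dual-number forward-mode AD below is purely illustrative (all names are ours, not the paper's): the program computes the identity function via a branch, and AD differentiates only the branch taken, so it returns the wrong derivative exactly at the branch point.

```python
# Minimal forward-mode AD via dual numbers (illustrative sketch only,
# not the construction from the paper). A dual number carries a value
# and a tangent; arithmetic on branches propagates only the tangent of
# the branch actually executed.

from dataclasses import dataclass

@dataclass
class Dual:
    val: float  # primal value
    tan: float  # tangent: derivative carried alongside the value

def deriv(f, x):
    """Forward-mode derivative of f at x: seed the tangent with 1.0."""
    return f(Dual(x, 1.0)).tan

def f(x):
    # Mathematically f is the identity function, so its true derivative
    # is 1 everywhere. But the program branches on x == 0, and AD
    # differentiates only the branch that is taken.
    if x.val == 0.0:
        return Dual(0.0, 0.0)  # constant branch: AD sees derivative 0
    return x

print(deriv(f, 1.0))  # 1.0: agrees with the true derivative
print(deriv(f, 0.0))  # 0.0: true derivative is 1; AD is wrong only at x = 0
```

The disagreement occurs only at the single input x = 0, a measure-zero subset of the reals, matching the guarantee the abstract attributes to Mazza and Pagani [2021].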


Related research

Higher Order Automatic Differentiation of Higher Order Functions (01/17/2021)
We present semantic correctness proofs of automatic differentiation (AD)...

Automatic Differentiation in PCF (11/06/2020)
We study the correctness of automatic differentiation (AD) in the contex...

Densities of almost-surely terminating probabilistic programs are differentiable almost everywhere (04/08/2020)
We study the differential properties of higher-order statistical probabi...

λ_S: Computable semantics for differentiable programming with higher-order functions and datatypes (07/15/2020)
Deep learning is moving towards increasingly sophisticated optimization ...

On Correctness of Automatic Differentiation for Non-Differentiable Functions (06/12/2020)
Differentiation lies at the core of many machine-learning algorithms, an...

Differentiable Programming: Efficient Smoothing of Control-Flow-Induced Discontinuities (05/11/2023)
We want to obtain derivatives in discontinuous program code, where defau...

ωPAP Spaces: Reasoning Denotationally About Higher-Order, Recursive Probabilistic and Differentiable Programs (02/21/2023)
We introduce a new setting, the category of ωPAP spaces, for reasoning d...
