ADEV: Sound Automatic Differentiation of Expected Values of Probabilistic Programs

12/13/2022
by Alexander K. Lew, et al.

Optimizing the expected values of probabilistic processes is a central problem in computer science and its applications, arising in fields ranging from artificial intelligence to operations research to statistical computing. Unfortunately, automatic differentiation techniques developed for deterministic programs do not in general compute the correct gradients needed for widely used solutions based on gradient-based optimization. In this paper, we present ADEV, an extension to forward-mode AD that correctly differentiates the expectations of probabilistic processes represented as programs that make random choices. Our algorithm is a source-to-source program transformation on an expressive, higher-order language for probabilistic computation, with both discrete and continuous probability distributions. The result of our transformation is a new probabilistic program, whose expected return value is the derivative of the original program's expectation. This output program can be run to generate unbiased Monte Carlo estimates of the desired gradient, which can then be used within the inner loop of stochastic gradient descent. We prove ADEV correct using logical relations over the denotations of the source and target probabilistic programs. Because it modularly extends forward-mode AD, our algorithm lends itself to a concise implementation strategy, which we exploit to develop a prototype in just a few dozen lines of Haskell (https://github.com/probcomp/adev).
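The failure mode the abstract describes can be illustrated with a small sketch. This is Python rather than the paper's Haskell, and it uses a plain score-function (REINFORCE) estimator rather than ADEV's actual source-to-source transformation; the function names are illustrative, not from the paper. Naively differentiating through a Bernoulli sampler gives gradient 0, because each sample is piecewise constant in the parameter θ; the score-function identity d/dθ E[f(x)] = E[f(x) · d/dθ log p(x; θ)] instead yields an unbiased Monte Carlo estimate of the true derivative.

```python
import random

def f(x):
    # Test function: f(x) = x^2. For x ~ Bernoulli(theta),
    # E[f(x)] = theta, so the true derivative d/dtheta E[f] = 1.
    return x * x

def score_grad_estimate(theta, n_samples, rng):
    """Unbiased score-function estimator of d/dtheta E_{x~Bern(theta)}[f(x)]."""
    total = 0.0
    for _ in range(n_samples):
        x = 1.0 if rng.random() < theta else 0.0
        # Score: d/dtheta log p(x; theta) = x/theta - (1 - x)/(1 - theta)
        score = x / theta - (1.0 - x) / (1.0 - theta)
        total += f(x) * score
    return total / n_samples

rng = random.Random(0)  # fixed seed for reproducibility
est = score_grad_estimate(0.3, 200_000, rng)
# est is a Monte Carlo estimate of the true derivative, 1.0
```

ADEV generalizes this idea: its output program is itself probabilistic, and running it yields unbiased samples of the derivative, suitable for stochastic gradient descent.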

Related research

10/16/2022 · Automatic Differentiation of Programs with Discrete Randomness
Automatic differentiation (AD), a technique for constructing new program...

12/20/2022 · Efficient and Sound Differentiable Programming in a Functional Array-Processing Language
Automatic differentiation (AD) is a technique for computing the derivati...

05/13/2023 · Automatic Differentiation in Prolog
Automatic differentiation (AD) is a range of algorithms to compute the n...

02/13/2019 · Proving Expected Sensitivity of Probabilistic Programs with Randomized Execution Time
The notion of program sensitivity (aka Lipschitz continuity) specifies t...

02/21/2023 · ωPAP Spaces: Reasoning Denotationally About Higher-Order, Recursive Probabilistic and Differentiable Programs
We introduce a new setting, the category of ωPAP spaces, for reasoning d...

07/20/2020 · Randomized Automatic Differentiation
The successes of deep learning, variational inference, and many other fi...

08/31/2023 · Branches of a Tree: Taking Derivatives of Programs with Discrete and Branching Randomness in High Energy Physics
We propose to apply several gradient estimation techniques to enable the...
