Differentiating Metropolis-Hastings to Optimize Intractable Densities

06/13/2023
by Gaurav Arya et al.

We develop an algorithm for automatic differentiation of Metropolis-Hastings samplers, allowing us to differentiate through probabilistic inference even when the model contains discrete components. Our approach fuses recent advances in stochastic automatic differentiation with traditional Markov chain coupling schemes, yielding an unbiased and low-variance gradient estimator. This allows us to apply gradient-based optimization to objectives expressed as expectations over intractable target densities. We demonstrate our approach by finding an ambiguous observation in a Gaussian mixture model and by maximizing the specific heat in an Ising model.
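For context, the sampler being differentiated is the standard Metropolis-Hastings algorithm: propose a perturbation of the current state and accept it with probability min(1, p(x')/p(x)). The sketch below shows plain (non-differentiated) random-walk Metropolis-Hastings on a two-component Gaussian mixture, similar to the kind of target used in the paper's first demonstration; the mixture means and proposal scale are illustrative choices, not taken from the paper.

```python
import math
import random

def log_density(x, mu1=-2.0, mu2=2.0):
    # Unnormalized log-density of an equal-weight two-component
    # Gaussian mixture (illustrative means, unit variances).
    return math.log(
        math.exp(-0.5 * (x - mu1) ** 2) + math.exp(-0.5 * (x - mu2) ** 2)
    )

def metropolis_hastings(log_p, x0=0.0, n_steps=10_000, step=1.0, seed=0):
    # Random-walk Metropolis-Hastings: propose a Gaussian perturbation
    # and accept with probability min(1, p(x') / p(x)), i.e. accept when
    # log(u) < log p(x') - log p(x) for u ~ Uniform(0, 1).
    rng = random.Random(seed)
    x, samples = x0, []
    for _ in range(n_steps):
        proposal = x + rng.gauss(0.0, step)
        if math.log(rng.random()) < log_p(proposal) - log_p(x):
            x = proposal  # accept; otherwise keep the current state
        samples.append(x)
    return samples

samples = metropolis_hastings(log_density)
mean = sum(samples) / len(samples)  # roughly 0 for this symmetric mixture
```

The discrete accept/reject branch is exactly what makes this program non-differentiable in the ordinary sense; the paper's contribution is a coupled-chain gradient estimator that handles such discrete randomness without bias.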


Related research

- Storchastic: A Framework for General Stochastic Automatic Differentiation (04/01/2021)
- Automatic Differentiation in Mixture Models (12/13/2018)
- Automatic Differentiation of Programs with Discrete Randomness (10/16/2022)
- Credit Assignment Techniques in Stochastic Computation Graphs (01/07/2019)
- Automatic Variational ABC (06/28/2016)
- Inverse design of photonic crystals through automatic differentiation (03/01/2020)
- Improved Marginal Unbiased Score Expansion (MUSE) via Implicit Differentiation (09/21/2022)
