Composing Modeling and Inference Operations with Probabilistic Program Combinators

11/14/2018
by Eli Sennesh et al.

Probabilistic programs with dynamic computation graphs can define measures over sample spaces with unbounded dimensionality, and thereby constitute programmatic analogues to Bayesian nonparametrics. Owing to the generality of this model class, inference relies on "black-box" Monte Carlo methods that are generally not able to take advantage of conditional independence and exchangeability, which have historically been the cornerstones of efficient inference. We here seek to develop a "middle ground" between probabilistic models with fully dynamic and fully static computation graphs. To this end, we introduce a combinator library for the Probabilistic Torch framework. Combinators are functions that accept models and return transformed models. We assume that models are dynamic, but that model composition is static, in the sense that combinator application takes place prior to evaluating the model on data. Combinators provide primitives for both model and inference composition. Model combinators take the form of classic functional programming constructs such as map and reduce. These constructs define a computation graph at a coarsened level of representation, in which nodes correspond to models, rather than individual variables. Inference combinators - such as enumeration, importance resampling, and Markov Chain Monte Carlo operators - assume a sampling semantics for model evaluation, in which application of combinators preserves proper weighting. Owing to this property, models defined using combinators can be trained using stochastic methods that optimize either variational or wake-sleep style objectives. As a validation of this principle, we use combinators to implement black box inference for hidden Markov models.
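To make the combinator idea concrete, the following is a minimal Python sketch, not the actual Probabilistic Torch combinator API. The names sequence, importance_resample, and step are hypothetical and chosen only for illustration. Models are treated here as functions that return a sample together with a log importance weight; a model combinator folds a single-step model over a list of inputs (as in the hidden Markov model example), and an inference combinator wraps a model with importance resampling while returning a properly weighted sample.

# Illustrative sketch only: these combinator names are hypothetical and do not
# reflect the real Probabilistic Torch API.
import math
import random

def sequence(step_model):
    # Model combinator: fold a single-step model over a list of inputs,
    # threading the latent state and accumulating the log weight.
    def composed(state, inputs):
        log_weight = 0.0
        states = []
        for x in inputs:
            state, lw = step_model(state, x)
            log_weight += lw
            states.append(state)
        return states, log_weight
    return composed

def importance_resample(model, num_particles=100):
    # Inference combinator: run the model for several particles, resample one
    # in proportion to the importance weights, and report the average particle
    # weight so the result remains a properly weighted sample.
    def composed(*args):
        particles, log_weights = zip(*(model(*args) for _ in range(num_particles)))
        max_lw = max(log_weights)
        weights = [math.exp(lw - max_lw) for lw in log_weights]
        chosen = random.choices(particles, weights=weights, k=1)[0]
        avg_log_weight = max_lw + math.log(sum(weights) / num_particles)
        return chosen, avg_log_weight
    return composed

# Toy single-step model: a latent Gaussian random walk observed with noise.
def step(state, obs):
    new_state = state + random.gauss(0.0, 1.0)      # sample the transition
    log_weight = -0.5 * (obs - new_state) ** 2      # Gaussian log-likelihood (up to a constant)
    return new_state, log_weight

# Combinator application is static: it happens before the model sees any data.
hmm_like = importance_resample(sequence(step), num_particles=50)
states, log_weight = hmm_like(0.0, [0.3, -0.1, 0.7])

In this sketch the composed model is itself just another model (a function returning a sample and a log weight), which is what allows combinators to nest and defines a computation graph whose nodes are models rather than individual random variables.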
