
Composing Modeling and Inference Operations with Probabilistic Program Combinators

by Eli Sennesh, et al.
Northeastern University

Probabilistic programs with dynamic computation graphs can define measures over sample spaces with unbounded dimensionality, and thereby constitute programmatic analogues to Bayesian nonparametrics. Owing to the generality of this model class, inference relies on "black-box" Monte Carlo methods that are generally not able to take advantage of conditional independence and exchangeability, which have historically been the cornerstones of efficient inference. We here seek to develop a "middle ground" between probabilistic models with fully dynamic and fully static computation graphs. To this end, we introduce a combinator library for the Probabilistic Torch framework. Combinators are functions that accept models and return transformed models. We assume that models are dynamic, but that model composition is static, in the sense that combinator application takes place prior to evaluating the model on data. Combinators provide primitives for both model and inference composition. Model combinators take the form of classic functional programming constructs such as map and reduce. These constructs define a computation graph at a coarsened level of representation, in which nodes correspond to models, rather than individual variables. Inference combinators - such as enumeration, importance resampling, and Markov Chain Monte Carlo operators - assume a sampling semantics for model evaluation, in which application of combinators preserves proper weighting. Owing to this property, models defined using combinators can be trained using stochastic methods that optimize either variational or wake-sleep style objectives. As a validation of this principle, we use combinators to implement black box inference for hidden Markov models.
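The core idea can be sketched in a few lines of plain Python. Here a "model" is any function that maps an input to a pair of a sample and a log importance weight; `sequence` is a model combinator that chains models at the coarsened level of representation, and `importance_resample` is an inference combinator that wraps a model while preserving proper weighting. The names and signatures below are illustrative assumptions for exposition, not the Probabilistic Torch combinator API.

```python
import math
import random

# Assumed representation: a model is a function x -> (sample, log_weight).

def sequence(*models):
    """Model combinator: chain models so each consumes the previous
    sample, accumulating the log importance weights."""
    def composed(x):
        total_log_w = 0.0
        for m in models:
            x, log_w = m(x)
            total_log_w += log_w
        return x, total_log_w
    return composed

def importance_resample(model, num_particles=100):
    """Inference combinator: run the model as a population of particles,
    then resample one in proportion to the importance weights. The
    returned weight is the average particle weight, which keeps the
    sample/weight pair properly weighted for the same target."""
    def resampled(x):
        particles = [model(x) for _ in range(num_particles)]
        weights = [math.exp(log_w) for _, log_w in particles]
        chosen_sample, _ = random.choices(particles, weights=weights, k=1)[0]
        avg_log_w = math.log(sum(weights) / num_particles)
        return chosen_sample, avg_log_w
    return resampled

# Toy model step: a random-walk proposal weighted by an unnormalized
# target density that favors values near zero.
def step(x):
    proposal = x + random.gauss(0.0, 1.0)
    return proposal, -proposal ** 2

# Combinator application happens before the model ever sees data:
model = importance_resample(sequence(step, step), num_particles=50)
sample, log_weight = model(5.0)
```

Note that `sequence(step, step)` and `importance_resample(...)` are themselves models of the same shape, which is what makes composition static and freely nestable even though each model's internal computation graph may be dynamic.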

