Learning Proposals for Probabilistic Programs with Inference Combinators

03/01/2021
by Sam Stites, et al.

We develop operators for the construction of proposals in probabilistic programs, which we refer to as inference combinators. Inference combinators define a grammar over importance samplers that compose primitive operations such as application of a transition kernel and importance resampling. Proposals in these samplers can be parameterized using neural networks, which in turn can be trained by optimizing variational objectives. The result is a framework for user-programmable variational methods that are correct by construction and can be tailored to specific models. We demonstrate the flexibility of this framework by implementing advanced variational methods based on amortized Gibbs sampling and annealing.
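
The combinator grammar described in the abstract composes primitive importance-sampling operations into larger samplers. As a rough illustration of that compositional style, here is a minimal, self-contained Python sketch in which a propose combinator reweights samples from a proposal toward a target density and a resample combinator performs importance resampling. All names and signatures here (Sampler, primitive, propose, resample) are assumptions made for illustration, not the paper's actual API, and the sketch omits transition kernels and the neural parameterization of proposals.

```python
# A minimal sketch of the inference-combinator idea: "propose" performs
# importance sampling and "resample" performs importance resampling.
# Names and signatures are illustrative assumptions, not the paper's API.
import numpy as np

rng = np.random.default_rng(0)


class Sampler:
    """A weighted sampler: draw() returns a batch of values and log weights."""

    def __init__(self, draw):
        self.draw = draw


def primitive(sample_fn, n=10_000):
    """Wrap a primitive program as a sampler with uniform (zero) log weights."""
    return Sampler(lambda: (sample_fn(n), np.zeros(n)))


def propose(target_log_prob, proposal, proposal_log_prob):
    """Importance sampling: reweight the proposal's samples toward the target."""

    def draw():
        xs, logw = proposal.draw()
        # Incremental importance weight log p(x) - log q(x).
        return xs, logw + target_log_prob(xs) - proposal_log_prob(xs)

    return Sampler(draw)


def resample(sampler):
    """Importance resampling: draw ancestors in proportion to the weights."""

    def draw():
        xs, logw = sampler.draw()
        w = np.exp(logw - logw.max())
        idx = rng.choice(len(xs), size=len(xs), p=w / w.sum())
        return xs[idx], np.zeros(len(xs))  # weights uniform after resampling

    return Sampler(draw)


def normal_logpdf(x, mu, sigma):
    return -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))


# Target N(2, 1) sampled via a broad proposal N(0, 2), then resampled.
q = primitive(lambda n: rng.normal(0.0, 2.0, size=n))
sampler = resample(
    propose(
        target_log_prob=lambda x: normal_logpdf(x, 2.0, 1.0),
        proposal=q,
        proposal_log_prob=lambda x: normal_logpdf(x, 0.0, 2.0),
    )
)
xs, _ = sampler.draw()
print(round(xs.mean(), 2))  # close to the target mean of 2.0
```

Because each combinator returns another Sampler, richer strategies can be built by nesting combinators, which mirrors the grammar-over-samplers idea described in the abstract.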

Related research

Using probabilistic programs as proposals (01/11/2018)
Monte Carlo inference has asymptotic guarantees, but can be slow when us...

Neural Block Sampling (08/21/2017)
Efficient Monte Carlo inference often requires manual construction of mo...

Amortized Population Gibbs Samplers with Neural Sufficient Statistics (11/04/2019)
We develop amortized population Gibbs (APG) samplers, a new class of aut...

Nested Variational Inference (06/21/2021)
We develop nested variational inference (NVI), a family of methods that ...

Amortized Rejection Sampling in Universal Probabilistic Programming (10/20/2019)
Existing approaches to amortized inference in probabilistic programs wit...

Variational Gaussian Copula Inference (06/19/2015)
We utilize copulas to constitute a unified framework for constructing an...
