
Automated Variational Inference in Probabilistic Programming
We present a new algorithm for approximate inference in probabilistic pr...

Elements of Sequential Monte Carlo
A core problem in statistics and probabilistic machine learning is to co...

Predictive Coarse-Graining
We propose a data-driven, coarse-graining formulation in the context of ...

A Dynamic Programming Algorithm for Inference in Recursive Probabilistic Programs
We describe a dynamic programming algorithm for computing the marginal d...

Output-Sensitive Adaptive Metropolis-Hastings for Probabilistic Programs
We introduce an adaptive output-sensitive Metropolis-Hastings algorithm ...

Context-Specific Approximation in Probabilistic Inference
There is evidence that the numbers in probabilistic inference don't real...

PClean: Bayesian Data Cleaning at Scale with Domain-Specific Probabilistic Programming
Data cleaning can be naturally framed as probabilistic inference in a ge...

Coarse-to-Fine Sequential Monte Carlo for Probabilistic Programs
Many practical techniques for probabilistic inference require a sequence of distributions that interpolate between a tractable distribution and an intractable distribution of interest. Usually, the sequences used are simple, e.g., based on geometric averages between distributions. When models are expressed as probabilistic programs, the models themselves are highly structured objects that can be used to derive annealing sequences that are more sensitive to domain structure. We propose an algorithm for transforming probabilistic programs to coarse-to-fine programs which have the same marginal distribution as the original programs, but generate the data at increasing levels of detail, from coarse to fine. We apply this algorithm to an Ising model, its depth-from-disparity variation, and a factorial hidden Markov model. We show preliminary evidence that the use of coarse-to-fine models can make existing generic inference algorithms more efficient.
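The geometric-average annealing sequence the abstract contrasts against can be sketched with standard annealed SMC. This is an illustrative sketch only, not the paper's program transformation: the function names, the tempering schedule, and the toy Gaussian base/target pair are all our own assumptions.

```python
import numpy as np

def annealed_smc(log_p0, sample_p0, log_target, n_particles=1000,
                 betas=np.linspace(0.0, 1.0, 20), step=0.5, seed=0):
    """SMC over the geometric annealing sequence
    log pi_t(x) = (1 - beta_t) * log_p0(x) + beta_t * log_target(x)."""
    rng = np.random.default_rng(seed)
    x = sample_p0(rng, n_particles)          # start from the tractable base
    logw = np.zeros(n_particles)
    for b_prev, b in zip(betas[:-1], betas[1:]):
        # Reweight by the incremental ratio pi_t(x) / pi_{t-1}(x)
        logw += (b - b_prev) * (log_target(x) - log_p0(x))
        # Resample when the effective sample size drops below half
        w = np.exp(logw - logw.max()); w /= w.sum()
        if 1.0 / np.sum(w**2) < n_particles / 2:
            x = x[rng.choice(n_particles, n_particles, p=w)]
            logw = np.zeros(n_particles)
        # One random-walk Metropolis move targeting pi_t
        log_pi = lambda y: (1 - b) * log_p0(y) + b * log_target(y)
        prop = x + step * rng.standard_normal(n_particles)
        accept = np.log(rng.random(n_particles)) < log_pi(prop) - log_pi(x)
        x = np.where(accept, prop, x)
    w = np.exp(logw - logw.max()); w /= w.sum()
    return x, w

# Toy example: anneal from the broad base N(0, 3^2) to the target N(4, 1)
log_p0 = lambda x: -0.5 * (x / 3.0) ** 2
sample_p0 = lambda rng, n: 3.0 * rng.standard_normal(n)
log_target = lambda x: -0.5 * (x - 4.0) ** 2
x, w = annealed_smc(log_p0, sample_p0, log_target)
mean = np.sum(w * x)   # weighted posterior mean, close to 4
```

A coarse-to-fine program in the paper's sense would replace this generic schedule with intermediate distributions derived from the model's own structure (e.g., coarsened versions of an Ising lattice), rather than a scalar temperature.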