Coarse-to-Fine Sequential Monte Carlo for Probabilistic Programs
Many practical techniques for probabilistic inference require a sequence of distributions that interpolate between a tractable distribution and an intractable distribution of interest. Usually, the sequences used are simple, e.g., based on geometric averages between distributions. When models are expressed as probabilistic programs, the models themselves are highly structured objects that can be used to derive annealing sequences that are more sensitive to domain structure. We propose an algorithm for transforming probabilistic programs into coarse-to-fine programs that have the same marginal distribution as the original programs, but generate the data at increasing levels of detail, from coarse to fine. We apply this algorithm to an Ising model, its depth-from-disparity variation, and a factorial hidden Markov model. We show preliminary evidence that the use of coarse-to-fine models can make existing generic inference algorithms more efficient.
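The paper's transformation operates on probabilistic programs themselves; as a rough illustration of the underlying idea only, the following is a minimal two-level coarse-to-fine SMC sketch on a toy discrete model. Everything in it (the model, the bin width B, the midpoint surrogate coarse_lik) is an assumption made for illustration, not the paper's algorithm: a coarse variable groups fine latent values into bins so that the coarse-to-fine model keeps the same marginal over the fine variable, and SMC resamples at the coarse level before refining.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model (an assumption, not from the paper):
# latent x in {0,...,7} with a uniform prior; observation y ~ Normal(x, 1).
K = 8
prior = np.full(K, 1.0 / K)
sigma = 1.0
y_obs = 5.3

def lik(x):
    # exact likelihood p(y | x), up to a constant
    return np.exp(-0.5 * ((y_obs - x) / sigma) ** 2)

# Coarse variable c = x // B groups x into bins; its prior marginalizes
# p(x) over each bin, so the coarse-to-fine model has the same marginal on x.
B = 4  # bin width (assumed for illustration)

def coarse_lik(c):
    # cheap surrogate likelihood: evaluate the likelihood at the bin midpoint
    mid = c * B + (B - 1) / 2.0
    return np.exp(-0.5 * ((y_obs - mid) / sigma) ** 2)

N = 1000  # number of particles

# Level 1 (coarse): sample c, weight by the surrogate likelihood, resample.
coarse_prior = np.array([prior[c * B:(c + 1) * B].sum() for c in range(K // B)])
c = rng.choice(K // B, size=N, p=coarse_prior)
w1 = coarse_lik(c)
idx = rng.choice(N, size=N, p=w1 / w1.sum())
c = c[idx]

# Level 2 (fine): refine each particle by sampling x | c from the prior
# restricted to the bin, then correct with the incremental weight
# L(y | x) / L_coarse(y | c), which cancels the surrogate used at level 1.
x = np.array([
    rng.choice(B, p=prior[ci * B:(ci + 1) * B] / prior[ci * B:(ci + 1) * B].sum()) + ci * B
    for ci in c
])
w2 = lik(x) / coarse_lik(c)

posterior = np.bincount(x, weights=w2, minlength=K)
posterior /= posterior.sum()
print("SMC posterior over x:", np.round(posterior, 3))
```

Because the coarse prior exactly marginalizes the fine prior over each bin, the sketch mirrors the abstract's guarantee that the transformed model preserves the original marginal distribution; the surrogate likelihood influences only where particles are concentrated, not correctness, since the fine-level incremental weight corrects for it.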