
Encapsulating models and approximate inference programs in probabilistic modules
This paper introduces the probabilistic module interface, which allows e...
12/14/2016 ∙ by Marco F. Cusumano-Towner, et al.

Towards Practical Bayesian Parameter and State Estimation
Joint state and parameter estimation is a core problem for dynamic Bayes...
03/29/2016 ∙ by Yusuf Bugra Erol, et al.

Venture: a higher-order probabilistic programming platform with programmable inference
We describe Venture, an interactive virtual machine for probabilistic pr...
04/01/2014 ∙ by Vikash Mansinghka, et al.

Getting Started with Particle Metropolis-Hastings for Inference in Nonlinear Dynamical Models
This tutorial provides a gentle introduction to the particle Metropolis...
11/05/2015 ∙ by Johan Dahlin, et al.

Approximate Bayesian Image Interpretation using Generative Probabilistic Graphics Programs
The idea of computer vision as the Bayesian inverse problem to computer ...
06/29/2013 ∙ by Vikash K. Mansinghka, et al.

Bayesian Parameter Estimation for Latent Markov Random Fields and Social Networks
Undirected graphical models are widely used in statistics, physics and m...
03/14/2012 ∙ by Richard G. Everitt, et al.

Block-Value Symmetries in Probabilistic Graphical Models
Several lifted inference algorithms for probabilistic graphical models f...
07/02/2018 ∙ by Gagan Madan, et al.

Sublinear-Time Approximate MCMC Transitions for Probabilistic Programs
Probabilistic programming languages can simplify the development of machine learning techniques, but only if inference is sufficiently scalable. Unfortunately, Bayesian parameter estimation for highly coupled models such as regressions and state-space models still scales poorly; each MCMC transition takes linear time in the number of observations. This paper describes a sublinear-time algorithm for making Metropolis-Hastings (MH) updates to latent variables in probabilistic programs. The approach generalizes recently introduced approximate MH techniques: instead of subsampling data items assumed to be independent, it subsamples edges in a dynamically constructed graphical model. It thus applies to a broader class of problems and interoperates with other general-purpose inference techniques. Empirical results, including confirmation of sublinear per-transition scaling, are presented for Bayesian logistic regression, nonlinear classification via joint Dirichlet process mixtures, and parameter estimation for stochastic volatility models (with state estimation via particle MCMC). All three applications use the same implementation, and each requires under 20 lines of probabilistic code.
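The core idea of the abstract — estimating the MH acceptance ratio from a random subsample of edges (factors) rather than summing over all of them — can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the names `subsampled_mh_step`, `edge_logps`, `propose`, and `log_prior` are hypothetical, edges are drawn uniformly at random, and the subsampled log-likelihood difference is scaled up to an unbiased estimate for the full edge set.

```python
import math
import random

def subsampled_mh_step(state, propose, log_prior, edge_logps, subsample_size,
                       rng=random):
    """One approximate MH update whose acceptance ratio is estimated
    from a random subsample of edge log-potentials.

    edge_logps: list of callables, each mapping a latent state to the
    log-potential of one edge in the (here, static) graphical model.
    """
    proposed = propose(state)
    k = min(subsample_size, len(edge_logps))
    sampled_edges = rng.sample(edge_logps, k)
    # Scale the subsample's log-likelihood difference up to an
    # estimate of the difference over all edges.
    scale = len(edge_logps) / k
    delta_est = scale * sum(lp(proposed) - lp(state) for lp in sampled_edges)
    log_alpha = (log_prior(proposed) - log_prior(state)) + delta_est
    # Standard MH accept/reject, using the noisy estimate of log alpha.
    if math.log(rng.random()) < log_alpha:
        return proposed
    return state
```

In this sketch the edge set is fixed, so subsampling edges coincides with the earlier data-subsampling schemes; the paper's contribution is applying the idea to edges of a dynamically constructed graphical model, which covers coupled models where observations are not independent. The noisy acceptance ratio makes the chain approximate rather than exact, which is the trade-off these methods accept for sublinear per-transition cost.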