
Sublinear-Time Approximate MCMC Transitions for Probabilistic Programs
Probabilistic programming languages can simplify the development of machine learning techniques, but only if inference is sufficiently scalable. Unfortunately, Bayesian parameter estimation for highly coupled models such as regressions and state-space models still scales poorly; each MCMC transition takes linear time in the number of observations. This paper describes a sublinear-time algorithm for making Metropolis-Hastings (MH) updates to latent variables in probabilistic programs. The approach generalizes recently introduced approximate MH techniques: instead of subsampling data items assumed to be independent, it subsamples edges in a dynamically constructed graphical model. It thus applies to a broader class of problems and interoperates with other general-purpose inference techniques. Empirical results, including confirmation of sublinear per-transition scaling, are presented for Bayesian logistic regression, nonlinear classification via joint Dirichlet process mixtures, and parameter estimation for stochastic volatility models (with state estimation via particle MCMC). All three applications use the same implementation, and each requires under 20 lines of probabilistic code.
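The data-subsampling special case that this paper generalizes can be sketched in a few lines. The sketch below is a hypothetical illustration, not the paper's algorithm: instead of subsampling edges of a dynamically constructed graphical model, it subsamples i.i.d. likelihood terms (the "data items assumed to be independent" setting) and scales the subsample's log-likelihood ratio up to the full dataset, yielding a noisy but cheap MH acceptance test. All names (`approx_mh_step`, the toy Gaussian model) are invented for illustration.

```python
import math
import random

random.seed(0)

def approx_mh_step(theta, data, log_prior, log_lik_term, proposal, m):
    """One approximate MH update: estimate the log-likelihood ratio from
    a random subsample of m terms, scaled up to the full dataset size,
    instead of evaluating all len(data) terms (cost O(m), not O(n))."""
    theta_new = proposal(theta)
    n = len(data)
    subsample = random.sample(data, min(m, n))
    # Unbiased (but noisy) estimate of the full log-likelihood ratio.
    est = (n / len(subsample)) * sum(
        log_lik_term(theta_new, x) - log_lik_term(theta, x) for x in subsample
    )
    log_alpha = log_prior(theta_new) - log_prior(theta) + est
    if math.log(random.random()) < log_alpha:
        return theta_new
    return theta

# Toy model: unknown mean of unit-variance Gaussian data, N(0, 1) prior.
data = [random.gauss(2.0, 1.0) for _ in range(2000)]
log_prior = lambda t: -0.5 * t * t
log_lik = lambda t, x: -0.5 * (x - t) ** 2
proposal = lambda t: t + random.gauss(0.0, 0.05)

theta = 0.0
for _ in range(2000):
    theta = approx_mh_step(theta, data, log_prior, log_lik, proposal, m=200)
```

The noisy acceptance test perturbs the stationary distribution, which is why such schemes are "approximate" MH; the paper's edge-subsampling view extends this idea beyond the i.i.d.-data case to coupled models like regressions and state-space models.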