Particle Gibbs with Ancestor Sampling for Probabilistic Programs

01/27/2015
by Jan-Willem van de Meent, et al.

Particle Markov chain Monte Carlo techniques rank among current state-of-the-art methods for probabilistic program inference. A drawback of these techniques is that they rely on importance resampling, which results in degenerate particle trajectories and a low effective sample size for variables sampled early in a program. Here we develop a formalism that adapts ancestor resampling, a technique that mitigates particle degeneracy, to the probabilistic programming setting. We present empirical results that demonstrate nontrivial performance gains.
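To make the idea concrete, here is a minimal sketch of one Particle Gibbs sweep with ancestor sampling (PGAS) for a simple linear-Gaussian state-space model, not the paper's probabilistic-programming formalism. The model, parameter names, and particle count are illustrative assumptions; the key step is that the pinned reference particle re-samples its ancestor using the transition density, which breaks up degenerate trajectories.

```python
import numpy as np

def pgas_sweep(y, x_ref, n_particles=20, phi=0.9, q=1.0, r=1.0, rng=None):
    """One conditional-SMC sweep with ancestor sampling for the toy model
    x_t = phi * x_{t-1} + N(0, q),  y_t = x_t + N(0, r).
    Returns a newly sampled reference trajectory."""
    rng = np.random.default_rng() if rng is None else rng
    T, N = len(y), n_particles
    x = np.zeros((T, N))
    anc = np.zeros((T, N), dtype=int)
    # t = 0: propose from the prior, pin the last particle to the reference
    x[0] = rng.normal(0.0, np.sqrt(q), N)
    x[0, N - 1] = x_ref[0]
    logw = -0.5 * (y[0] - x[0]) ** 2 / r
    for t in range(1, T):
        w = np.exp(logw - logw.max()); w /= w.sum()
        # ordinary multinomial resampling for particles 0..N-2
        anc[t, :N - 1] = rng.choice(N, size=N - 1, p=w)
        # ancestor sampling for the reference particle: weight each
        # candidate ancestor by its transition density to x_ref[t]
        logm = np.log(w) - 0.5 * (x_ref[t] - phi * x[t - 1]) ** 2 / q
        m = np.exp(logm - logm.max()); m /= m.sum()
        anc[t, N - 1] = rng.choice(N, p=m)
        # propagate, then pin the reference particle's state
        x[t] = phi * x[t - 1, anc[t]] + rng.normal(0.0, np.sqrt(q), N)
        x[t, N - 1] = x_ref[t]
        logw = -0.5 * (y[t] - x[t]) ** 2 / r
    # draw one trajectory by tracing ancestors back from a final particle
    w = np.exp(logw - logw.max()); w /= w.sum()
    k = rng.choice(N, p=w)
    traj = np.zeros(T)
    for t in range(T - 1, -1, -1):
        traj[t] = x[t, k]
        if t > 0:
            k = anc[t, k]
    return traj
```

A Particle Gibbs chain then alternates sweeps, feeding each returned trajectory back in as the next reference; without the ancestor-sampling step, all particles would tend to share a single ancestor at early time steps.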


