Particle-based, rapid incremental smoother meets particle Gibbs

09/21/2022
by Gabriel Cardoso, et al.

The particle-based, rapid incremental smoother (PARIS) is a sequential Monte Carlo technique allowing for efficient online approximation of expectations of additive functionals under Feynman–Kac path distributions. Under weak assumptions, the algorithm has linear computational complexity and limited memory requirements, and it comes with a number of non-asymptotic bounds and convergence results. However, being based on self-normalised importance sampling, the PARIS estimator is biased; its bias is inversely proportional to the number of particles but has been found to grow linearly with the time horizon under appropriate mixing conditions. In this work, we propose the Parisian particle Gibbs (PPG) sampler, whose complexity is essentially the same as that of PARIS and which significantly reduces the bias for a given computational complexity, at the price of a modest increase in variance. The method is a wrapper in the sense that it runs the PARIS algorithm in the inner loop of a particle Gibbs sampler to form a bias-reduced version of the targeted quantities. We substantiate the PPG algorithm with theoretical results, including new bounds on bias and variance as well as deviation inequalities, and we illustrate these results with numerical experiments supporting our claims.
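To make the "PARIS inside particle Gibbs" structure concrete, the following is a minimal Python sketch, not the authors' implementation: it assumes a toy linear-Gaussian state-space model with a bootstrap proposal, a simple additive functional (the sum of the states), multinomial resampling, and a plain average of the per-sweep estimates after a burn-in as a stand-in for the paper's bias-reduced estimator. All names (`csmc_paris_sweep`, `ppg_estimate`), model parameters, and tuning constants are hypothetical placeholders chosen for illustration only.

```python
import numpy as np

# Toy linear-Gaussian model (illustrative, not from the paper):
#   x_t = phi * x_{t-1} + sigma_x * eps_t,   y_t = x_t + sigma_y * nu_t
phi, sigma_x, sigma_y = 0.9, 1.0, 1.0
rng = np.random.default_rng(0)

def log_g(x, y):
    """Log observation density g(y | x), up to an additive constant."""
    return -0.5 * ((y - x) / sigma_y) ** 2

def log_m(x_prev, x_new):
    """Log transition density m(x_new | x_prev), up to an additive constant."""
    return -0.5 * ((x_new - phi * x_prev) / sigma_x) ** 2

def csmc_paris_sweep(y, ref, N=100, K=2):
    """One conditional SMC pass with PaRIS-style backward sampling for the
    additive functional h(x_{0:T}) = sum_t x_t. Returns a new reference
    trajectory and the sweep's estimate of the smoothed functional."""
    T = len(y)
    X = np.empty((T, N))                 # particle positions
    A = np.zeros((T, N), dtype=int)      # ancestor indices
    X[0] = sigma_x * rng.normal(size=N)
    X[0, 0] = ref[0]                     # slot 0 carries the retained path
    stats = X[0].copy()                  # PARIS statistics: S_0(x) = x_0
    logw = log_g(X[0], y[0])
    for t in range(1, T):
        w = np.exp(logw - logw.max()); w /= w.sum()
        anc = rng.choice(N, size=N, p=w) # multinomial resampling
        anc[0] = 0                       # reference particle keeps its lineage
        X[t] = phi * X[t - 1, anc] + sigma_x * rng.normal(size=N)
        X[t, 0] = ref[t]
        A[t] = anc
        new_stats = np.empty(N)
        for i in range(N):               # PARIS update: K backward draws per particle
            logb = logw + log_m(X[t - 1], X[t, i])
            b = np.exp(logb - logb.max()); b /= b.sum()
            js = rng.choice(N, size=K, p=b)
            new_stats[i] = np.mean(stats[js] + X[t, i])
        stats, logw = new_stats, log_g(X[t], y[t])
    w = np.exp(logw - logw.max()); w /= w.sum()
    estimate = float(np.sum(w * stats))  # self-normalised PARIS estimate
    k = rng.choice(N, p=w)               # sample the next reference trajectory
    new_ref = np.empty(T)
    for t in range(T - 1, -1, -1):       # trace the ancestral lineage backwards
        new_ref[t] = X[t, k]
        if t > 0:
            k = A[t, k]
    return new_ref, estimate

def ppg_estimate(y, n_iter=50, burn_in=10):
    """Toy PPG-style wrapper: iterate conditional SMC + PARIS sweeps and
    average the estimates after a burn-in (a simplification used here only
    to show the wrapper structure, not the paper's exact estimator)."""
    ref, out = np.zeros(len(y)), []
    for _ in range(n_iter):
        ref, est = csmc_paris_sweep(y, ref)
        out.append(est)
    return float(np.mean(out[burn_in:]))

# Usage: simulate data from the toy model and estimate E[sum_t x_t | y_{0:T}].
T = 200
x = np.empty(T); y = np.empty(T)
x[0] = sigma_x * rng.normal(); y[0] = x[0] + sigma_y * rng.normal()
for t in range(1, T):
    x[t] = phi * x[t - 1] + sigma_x * rng.normal()
    y[t] = x[t] + sigma_y * rng.normal()
print(ppg_estimate(y))
```

The key structural point the sketch tries to convey is the wrapper design: each outer (particle Gibbs) iteration runs a conditional SMC sweep that keeps one retained trajectory frozen, the PARIS backward-sampling update is carried along during that same forward pass, and the retained trajectory is refreshed at the end of the sweep.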
