Optimal Thinning of MCMC Output

05/08/2020
by   Marina Riabiz, et al.

The use of heuristics to assess the convergence and compress the output of Markov chain Monte Carlo can be sub-optimal in terms of the empirical approximations that are produced. Typically a number of the initial states are attributed to "burn in" and removed, whilst the remainder of the chain is "thinned" if compression is also required. In this paper we consider the problem of retrospectively selecting a subset of states, of fixed cardinality, from the sample path such that the approximation provided by their empirical distribution is close to optimal. A novel method is proposed, based on greedy minimisation of a kernel Stein discrepancy, that is suitable for problems where heavy compression is required. Theoretical results guarantee consistency of the method and its effectiveness is demonstrated in the challenging context of parameter inference for ordinary differential equations. Software is available in the Stein Thinning package in both Python and MATLAB.
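The greedy procedure described above can be sketched as follows. This is a minimal NumPy illustration, not the authors' Stein Thinning package: it builds the Langevin Stein kernel on an inverse-multiquadric base kernel and, at each step, appends the candidate state that most reduces the kernel Stein discrepancy of the selected subset. The names `imq_stein_kernel` and `stein_thin` are illustrative, and the score function (gradient of the log target) is assumed to be available for every sample.

```python
import numpy as np

def imq_stein_kernel(x, y, sx, sy, beta=-0.5):
    """Langevin Stein kernel k_0(x, y) built on the inverse
    multiquadric base kernel k(x, y) = (1 + ||x - y||^2)^beta,
    where sx, sy are the score (grad log target) at x and y."""
    d = x - y
    r2 = float(np.dot(d, d))
    base = (1.0 + r2) ** beta
    grad_x = 2.0 * beta * (1.0 + r2) ** (beta - 1.0) * d   # d k / dx
    grad_y = -grad_x                                       # d k / dy
    dim = x.shape[0]
    # divergence term: sum_i d^2 k / dx_i dy_i
    div = (-2.0 * beta * dim * (1.0 + r2) ** (beta - 1.0)
           - 4.0 * beta * (beta - 1.0) * r2 * (1.0 + r2) ** (beta - 2.0))
    return (div + np.dot(grad_x, sy) + np.dot(grad_y, sx)
            + base * np.dot(sx, sy))

def stein_thin(samples, scores, m):
    """Greedily select m states (indices, repeats allowed) whose
    empirical distribution minimises the kernel Stein discrepancy."""
    n = samples.shape[0]
    # self-interaction terms k_0(x_i, x_i), computed once
    k0_diag = np.array([imq_stein_kernel(samples[i], samples[i],
                                         scores[i], scores[i])
                        for i in range(n)])
    obj = 0.5 * k0_diag.copy()   # running objective per candidate
    chosen = []
    for _ in range(m):
        j = int(np.argmin(obj))
        chosen.append(j)
        # fold cross terms against the newly selected point into obj
        for i in range(n):
            obj[i] += imq_stein_kernel(samples[i], samples[j],
                                       scores[i], scores[j])
    return chosen
```

For a standard Gaussian target the score is simply `s(x) = -x`, so the sketch can be exercised on raw draws; in the paper's setting the samples would come from an MCMC sample path and the score from the (unnormalised) posterior.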

