Unbiased inference for discretely observed hidden Markov model diffusions

07/26/2018
by   Jordan Franks, et al.

We develop an importance sampling (IS) type estimator for Bayesian joint inference on the model parameters and latent states of a class of hidden Markov models (HMMs). We are interested in the class of HMMs for which the hidden-state dynamics are given by a diffusion process and noisy observations are obtained at discrete times. We suppose that the diffusion dynamics cannot be simulated exactly, so the diffusion must be time-discretised. Our approach is based on particle marginal Metropolis--Hastings (PMMH), particle filters (PFs), and (randomised) multilevel Monte Carlo (rMLMC). The estimator is built upon a single run of PMMH using a coarse discretisation of the model. The subsequent IS-type correction is based on a single-term rMLMC estimator, using output from a PF developed for estimating the level-difference integrals. The resulting IS-type estimator yields inference free of time-discretisation bias. We give convergence results, including a central limit theorem, and recommend allocations for the algorithm's inputs. The generality of our method sets it apart from existing unbiased methods based on exact simulation, which require strong conditions on the HMM diffusion. Moreover, our method is highly parallelisable. We illustrate our method on two examples from the literature.
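To make the structure of the estimator concrete, the following Python sketch shows the parameter-inference part of such a pipeline under stated assumptions. The helpers pmmh_coarse and pf_level_increment, the level probabilities, and all numerical values are illustrative placeholders rather than the paper's implementation; the sketch only illustrates how a single-term randomised MLMC draw turns the output of one coarse-level PMMH run into a self-normalised IS-type estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Placeholder model hooks (assumptions for illustration only) ---

def pmmh_coarse(n):
    """Stand-in for a PMMH run at the coarsest discretisation level:
    returns parameter samples and their coarse PF likelihood estimates."""
    thetas = rng.normal(size=n)                  # dummy chain output
    z0 = np.exp(-0.5 * thetas**2) + 1e-3         # dummy coarse-level estimates
    return thetas, z0

def pf_level_increment(theta, level):
    """Stand-in for a coupled particle filter estimating the
    marginal-likelihood increment Z_l - Z_{l-1} (Z_0 when level == 0)."""
    base = np.exp(-0.5 * theta**2) + 1e-3
    return base if level == 0 else base * 2.0**(-level) * rng.normal(1.0, 0.1)

# --- IS-type correction with a single-term randomised MLMC estimator ---

def single_term_weight(theta, z0_est, level_probs):
    """Unbiased estimate of the correction weight w ~ Z(theta) / Z_0(theta)."""
    l = rng.choice(len(level_probs), p=level_probs)        # random discretisation level
    z_est = pf_level_increment(theta, l) / level_probs[l]  # unbiased for the sum of increments
    return z_est / z0_est

def unbiased_posterior_mean(f, n_pmmh=1000, level_probs=(0.5, 0.25, 0.125, 0.125)):
    """Self-normalised IS estimate of E[f(theta)] under the
    discretisation-free posterior, built on one coarse PMMH run."""
    thetas, z0 = pmmh_coarse(n_pmmh)
    probs = np.asarray(level_probs)
    weights = np.array([single_term_weight(t, z, probs)
                        for t, z in zip(thetas, z0)])      # embarrassingly parallel step
    values = np.array([f(t) for t in thetas])
    return np.sum(weights * values) / np.sum(weights)

print(unbiased_posterior_mean(lambda th: th))
```

The key design point mirrored here is that the expensive correction weights are computed independently for each stored PMMH sample, which is what makes the approach highly parallelisable, and the randomised choice of level removes the discretisation bias in expectation.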
