Generalized Bayesian Likelihood-Free Inference Using Scoring Rules Estimators
We propose a framework for Bayesian Likelihood-Free Inference (LFI) based on Generalized Bayesian Inference using scoring rules (SRs). SRs are used to evaluate probabilistic models given an observation; a proper SR is minimised in expectation when the model corresponds to the data-generating process for the observations. Using a strictly proper SR, for which the above minimum is unique, ensures posterior consistency of our method. Further, we prove finite-sample posterior consistency and outlier robustness of our posterior for the Kernel and Energy Scores. As the likelihood function is intractable in LFI, we employ consistent estimators of SRs computed from model simulations within a pseudo-marginal MCMC scheme; we show that the target of such a chain converges to the exact SR posterior as the number of simulations increases. Furthermore, we note that popular LFI techniques such as Bayesian Synthetic Likelihood (BSL) can be seen as special cases of our framework using SRs that are proper but not strictly so. We empirically validate our consistency and outlier-robustness results and show that related approaches do not enjoy these properties. In practice, we use the Energy and Kernel Scores, but our general framework sets the stage for extensions with other scoring rules.
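To make the simulation-based estimators mentioned above concrete, the sketch below computes an unbiased Monte Carlo estimate of the Energy Score from i.i.d. model simulations, using its standard definition S_E(P, y) = 2 E||X - y||^beta - E||X - X'||^beta with beta in (0, 2). The function name, the NumPy interface, and the default beta are illustrative choices of ours, not code from the paper.

```python
import numpy as np

def energy_score_estimate(simulations, y, beta=1.0):
    """Unbiased Monte Carlo estimate of the Energy Score S_E(P_theta, y).

    Illustrative sketch, not the paper's implementation.
    simulations : (m, d) array of draws x_i ~ P_theta, with m >= 2
    y           : (d,) observed data point
    beta        : exponent in (0, 2), for which the Energy Score is strictly proper
    """
    simulations = np.atleast_2d(simulations)
    y = np.atleast_1d(y)
    m = simulations.shape[0]

    # First term: 2 * E||X - y||^beta, averaged over the m simulations.
    term_obs = 2.0 * np.mean(np.linalg.norm(simulations - y, axis=1) ** beta)

    # Second term: E||X - X'||^beta, estimated from the off-diagonal pairwise
    # distances (diagonal entries are zero, so dividing by m*(m-1) excludes them).
    pairwise = np.linalg.norm(
        simulations[:, None, :] - simulations[None, :, :], axis=-1
    ) ** beta
    term_pairs = pairwise.sum() / (m * (m - 1))

    return term_obs - term_pairs
```

In the Generalized Bayesian construction the abstract refers to, an estimate of this kind would stand in for the negative log-likelihood inside a posterior of the form pi(theta | y) proportional to pi(theta) * exp(-w * S(P_theta, y)), with w a learning-rate weight, and the resulting noisy target would be sampled with pseudo-marginal MCMC; the exact weighting and tuning are as specified in the paper.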