Convergence of Pseudo-Bayes Factors in Forward and Inverse Regression Problems

06/10/2020
by   Debashis Chatterjee, et al.

In the Bayesian literature on model comparison, Bayes factors play the leading role. In the classical statistical literature, model selection criteria are often devised using cross-validation ideas. Amalgamating the ideas of Bayes factors and cross-validation, Geisser and Eddy (1979) created the pseudo-Bayes factor. The use of cross-validation confers several theoretical advantages, computational simplicity and numerical stability on Bayes factors, as the marginal density of the entire dataset is replaced with products of cross-validation densities of individual data points. However, the popularity of pseudo-Bayes factors is still negligible in comparison with Bayes factors, with respect to both theoretical investigations and practical applications. In this article, we establish almost sure exponential convergence of pseudo-Bayes factors for large samples under a general setup consisting of dependent data and model misspecification. We particularly focus on general parametric and nonparametric regression setups in both forward and inverse contexts. We illustrate our theoretical results with various examples, providing explicit calculations. We also supplement our asymptotic theory with simulation experiments in small-sample situations of Poisson log regression and geometric logit and probit regression, additionally addressing the variable selection problem. We consider both linear regression and nonparametric regression modeled by Gaussian processes for our purposes. Our simulation results provide quite interesting insights into the usage of pseudo-Bayes factors in forward and inverse setups.
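To make the core idea concrete: the pseudo-Bayes factor replaces each model's marginal likelihood with the product of leave-one-out cross-validation predictive densities f(y_i | y_{-i}) (the conditional predictive ordinates of Geisser and Eddy, 1979). The following is a minimal sketch, not the authors' setup, using a toy comparison where both leave-one-out predictives are available in closed form: M1 is N(theta, 1) with a conjugate N(0, 1) prior on theta, and M2 is the fixed N(0, 1) model. The data values are illustrative.

```python
import math

def norm_pdf(x, mean, var):
    """Density of N(mean, var) at x."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def log_pseudo_marginal_m1(y):
    # M1: y_i ~ N(theta, 1) with conjugate prior theta ~ N(0, 1).
    # With m = n - 1 points left in and s = sum(y_{-i}), the leave-one-out
    # predictive is y_i | y_{-i} ~ N(s / (m + 1), 1 + 1 / (m + 1)).
    n, total = len(y), sum(y)
    lp = 0.0
    for yi in y:
        m, s = n - 1, total - yi
        lp += math.log(norm_pdf(yi, s / (m + 1), 1.0 + 1.0 / (m + 1)))
    return lp

def log_pseudo_marginal_m2(y):
    # M2: fixed y_i ~ N(0, 1); the predictive does not depend on y_{-i}.
    return sum(math.log(norm_pdf(yi, 0.0, 1.0)) for yi in y)

# Illustrative data centred near 2, so M1 should be strongly favoured.
y = [1.8, 2.2, 2.5, 1.5, 2.0]
log_pbf = log_pseudo_marginal_m1(y) - log_pseudo_marginal_m2(y)
pbf = math.exp(log_pbf)  # pseudo-Bayes factor of M1 against M2
```

Because every factor is a one-dimensional predictive density, the computation stays numerically stable on the log scale even when the joint marginal likelihood would underflow; this is the stability advantage the abstract refers to.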
