Posterior Convergence of Gaussian and General Stochastic Process Regression Under Possible Misspecifications
In this article, we investigate posterior convergence in nonparametric regression models where the unknown regression function is modeled by an appropriate stochastic process. We consider two setups. The first setup is based on Gaussian processes, where the covariates are either random or non-random and the noise may be either normally or double-exponentially distributed. In the second setup, we assume that the underlying regression function is modeled by a reasonably smooth, but otherwise unspecified, stochastic process satisfying suitable conditions. The distribution of the noise is also left unspecified, but assumed to be thick-tailed. As in previous studies of these problems, we do not assume that the truth lies in the postulated parameter space, thus explicitly allowing for the possibility of misspecification. We exploit the general results of Shalizi (2009) to establish not only posterior consistency, but also the rate at which the posterior probabilities converge, which turns out to be the Kullback-Leibler divergence rate. We also investigate the more familiar posterior convergence rates. Interestingly, we show that the posterior predictive distribution can accurately approximate the best possible predictive distribution, in the sense that both the Hellinger distance and the total variation distance between the two distributions can tend to zero, in spite of misspecification.
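For reference, the Hellinger and total variation distances mentioned above have their standard definitions; the following is a minimal sketch in LaTeX notation, where p and q are assumed to denote the densities of the posterior predictive and the best possible predictive distributions with respect to a common dominating measure (notation not taken from the paper).

```latex
% Standard definitions of the two distances referred to in the abstract.
% p and q denote densities of the two predictive distributions with respect
% to a common dominating measure \mu (assumed notation, for illustration only).
\[
  d_H(p, q) \;=\; \left( \tfrac{1}{2} \int \bigl( \sqrt{p} - \sqrt{q} \bigr)^{2} \, d\mu \right)^{1/2},
  \qquad
  d_{TV}(p, q) \;=\; \tfrac{1}{2} \int \lvert p - q \rvert \, d\mu .
\]
% With these normalizations the standard inequality d_{TV} \le \sqrt{2}\, d_H holds,
% so convergence of the Hellinger distance to zero also forces the total
% variation distance to zero.
```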