Posterior Convergence of Nonparametric Binary and Poisson Regression Under Possible Misspecifications

05/01/2020
by Debashis Chatterjee, et al.

In this article, we investigate posterior convergence of nonparametric binary and Poisson regression under possible model misspecification, assuming a general stochastic process prior with appropriate properties. Our model setup and objective for binary regression are similar to those of Ghosal and Roy (2006), where the authors used entropy bounds and exponentially consistent tests with the sieve method to achieve consistency with respect to their Gaussian process prior. In contrast, for both binary and Poisson regression with a general stochastic process prior, our approach involves verification of the asymptotic equipartition property along with the method of sieves, a manoeuvre based on the general results of Shalizi (2009) that is useful even for misspecified models. Moreover, we establish not only posterior consistency but also the rate at which the posterior probabilities converge, which turns out to be the Kullback-Leibler divergence rate. We also investigate the traditional posterior convergence rates. Interestingly, from a subjective Bayesian viewpoint, we show that the posterior predictive distribution can accurately approximate the best possible predictive distribution, in the sense that both the Hellinger distance and the total variation distance between the two distributions can tend to zero, in spite of misspecifications.
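The abstract's closing claim is stated in terms of the Hellinger and total variation distances between the posterior predictive distribution and the best possible predictive distribution. For concreteness, here is a minimal Python sketch of both metrics on discrete distributions; the vectors `p` and `q` are illustrative and not from the paper. The check at the end uses the standard inequality H² ≤ TV ≤ √2·H, which explains why convergence in one metric controls the other.

```python
import numpy as np

def hellinger(p, q):
    # H(P, Q) = (1/sqrt(2)) * || sqrt(p) - sqrt(q) ||_2
    return np.sqrt(np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)) / np.sqrt(2)

def total_variation(p, q):
    # TV(P, Q) = (1/2) * sum_i |p_i - q_i|
    return 0.5 * np.sum(np.abs(p - q))

# Illustrative discrete distributions (assumed, not from the paper)
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

h = hellinger(p, q)
tv = total_variation(p, q)

# Standard relation between the two metrics: H^2 <= TV <= sqrt(2) * H,
# so if either distance tends to zero, so does the other.
assert h ** 2 <= tv <= np.sqrt(2) * h
```

Because the two metrics bound each other in this way, the paper's statement that both the Hellinger and total variation distances can tend to zero is internally consistent: convergence in either implies convergence in the other.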

