Bayesian model selection consistency and oracle inequality with intractable marginal likelihood

01/02/2017
by Yun Yang, et al.

In this article, we investigate large-sample properties of model selection procedures in a general Bayesian framework when a closed-form expression for the marginal likelihood function is not available or a local asymptotic quadratic approximation of the log-likelihood function does not exist. Under appropriate identifiability assumptions on the true model, we provide sufficient conditions for a Bayesian model selection procedure to be consistent and to exhibit the Occam's razor phenomenon, i.e., the probability of selecting the "smallest" model containing the truth tends to one as the sample size goes to infinity. To show that a Bayesian model selection procedure selects the smallest model containing the truth, we impose a prior anti-concentration condition, requiring the prior mass that large models assign to a neighborhood of the truth to be sufficiently small. In a more general setting where the strong model identifiability assumption may not hold, we introduce the notion of local Bayesian complexity and develop oracle inequalities for Bayesian model selection procedures. Our Bayesian oracle inequality characterizes a trade-off between the approximation error and a Bayesian measure of the local complexity of the model, illustrating the adaptive nature of averaging-based Bayesian procedures in achieving an optimal rate of posterior convergence. Specific applications of the model selection theory are discussed in the context of high-dimensional nonparametric regression and density regression, where the regression function or the conditional density is assumed to depend on a fixed subset of predictors. As a result of independent interest, we propose a general technique for obtaining upper bounds on certain small ball probabilities of stationary Gaussian processes.
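The paper studies settings where the marginal likelihood is intractable, but the Occam's razor behavior it describes can be illustrated in a toy conjugate-Gaussian setting where the marginal likelihood has a closed form. The sketch below (a minimal illustration, not the paper's method; the noise scale `sigma`, prior scale `tau`, and sample size are arbitrary choices) compares a small model containing the truth against a larger model with a spurious predictor via posterior model probabilities:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data generated from the "small" model: y = 1 + noise, so the slope
# on x is irrelevant. sigma and tau are illustrative, not from the paper.
n, sigma, tau = 500, 1.0, 2.0
x = rng.normal(size=n)
y = 1.0 + sigma * rng.normal(size=n)

def log_marginal(X):
    """Closed-form log marginal likelihood for y ~ N(X beta, sigma^2 I)
    with prior beta ~ N(0, tau^2 I): marginally y ~ N(0, sigma^2 I + tau^2 X X^T)."""
    S = sigma ** 2 * np.eye(n) + tau ** 2 * X @ X.T
    _, logdet = np.linalg.slogdet(S)
    return -0.5 * (n * np.log(2 * np.pi) + logdet + y @ np.linalg.solve(S, y))

X_small = np.ones((n, 1))                 # intercept only (contains the truth)
X_big = np.column_stack([np.ones(n), x])  # adds a spurious slope

lm_small, lm_big = log_marginal(X_small), log_marginal(X_big)
# Posterior probability of the small model under a uniform prior over models:
p_small = 1.0 / (1.0 + np.exp(lm_big - lm_small))
print(p_small)  # typically near 1: the smallest model containing the truth wins
```

The larger model also contains the truth, but its extra parameter spreads prior mass over fits the data do not need, so its marginal likelihood pays an Occam penalty; this is the "smallest model containing the truth" phenomenon formalized in the paper.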

Related research

08/01/2012 · Oracle inequalities for computationally adaptive model selection
We analyze general model selection procedures using penalized empirical ...

01/25/2023 · Model selection-based estimation for generalized additive models using mixtures of g-priors: Towards systematization
We consider estimation of generalized additive models using basis expans...

07/15/2020 · A Bayesian Multiple Testing Paradigm for Model Selection in Inverse Regression Problems
In this article, we propose a novel Bayesian multiple testing formulatio...

06/11/2018 · A framework for posterior consistency in model selection
We develop a theoretical framework for the frequentist assessment of Bay...

04/18/2008 · Margin-adaptive model selection in statistical learning
A classical condition for fast learning rates is the margin condition, f...

02/08/2019 · Bayesian Model Selection with Graph Structured Sparsity
We propose a general algorithmic framework for Bayesian model selection....
