Cramér-Rao Bound for Estimation After Model Selection and its Application to Sparse Vector Estimation

04/15/2019
by Elad Meir et al.

In many practical parameter estimation problems, such as coefficient estimation in polynomial regression and direction-of-arrival (DOA) estimation, model selection is performed prior to estimation. In these cases, it is assumed that the true measurement model belongs to a set of candidate models. The data-based model selection step affects the subsequent estimation, which may result in biased estimation. In particular, the oracle Cramér-Rao bound (CRB), which assumes knowledge of the true model, is inappropriate for post-model-selection performance analysis and system design outside the asymptotic region. In this paper, we analyze the performance of post-model-selection estimators using the mean-squared-selected-error (MSSE) criterion. We consider coherent estimators, which force unselected parameters to zero, and introduce the concept of selective unbiasedness in the sense of Lehmann unbiasedness. We derive a non-Bayesian Cramér-Rao-type bound on the MSSE and on the mean-squared-error (MSE) of any coherent and selectively unbiased estimator. As an important special case, we illustrate the computation and applicability of the proposed selective CRB for sparse vector estimation, in which model selection is equivalent to recovery of the support. Finally, we demonstrate in numerical simulations that the proposed selective CRB is a valid lower bound on the performance of the post-model-selection maximum likelihood estimator for the general linear model under different model selection criteria, and for sparse vector estimation with one-step thresholding. It is shown that in these cases the selective CRB is tighter than the existing bounds: the oracle CRB, the averaged CRB, and the SMS-CRB from [1].
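To make the post-model-selection setting concrete, the following is a minimal sketch (not the paper's derivation) of the sparse-vector special case: the support is selected by one-step thresholding, and a coherent estimator then performs least squares on the selected support while forcing unselected coefficients to zero. The empirical MSE is compared against the oracle CRB, which assumes the true support is known; all dimensions, the noise level, and the thresholding rule here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sparse linear model: y = A x + w, with x supported on a small set.
n, m, k = 40, 15, 3          # signal dimension, measurements, sparsity (assumed values)
sigma = 0.05                 # noise standard deviation (assumed)
A = rng.standard_normal((m, n)) / np.sqrt(m)
x = np.zeros(n)
support_true = rng.choice(n, size=k, replace=False)
x[support_true] = 1.0

trials = 2000
mse = 0.0
for _ in range(trials):
    y = A @ x + sigma * rng.standard_normal(m)
    # Model selection step: one-step thresholding --
    # keep the k columns with the largest correlation to the data.
    corr = np.abs(A.T @ y)
    S = np.argsort(corr)[-k:]
    # Coherent estimation step: least squares restricted to the selected
    # support; unselected coefficients are forced to zero.
    x_hat = np.zeros(n)
    x_hat[S] = np.linalg.lstsq(A[:, S], y, rcond=None)[0]
    mse += np.sum((x_hat - x) ** 2)
mse /= trials

# Oracle CRB for known support S: sigma^2 * tr((A_S^T A_S)^{-1}).
AS = A[:, support_true]
oracle_crb = sigma**2 * np.trace(np.linalg.inv(AS.T @ AS))
print(f"empirical post-selection MSE: {mse:.4f}")
print(f"oracle CRB (known support)  : {oracle_crb:.4f}")
```

Because the support estimate is data-dependent, occasional selection errors inflate the empirical MSE relative to the oracle CRB, illustrating why a selection-aware bound is needed outside the asymptotic (high-SNR) region.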
