
Cramer-Rao Bound for Estimation After Model Selection and its Application to Sparse Vector Estimation

04/15/2019
by Elad Meir, et al.
Ben-Gurion University of the Negev

In many practical parameter estimation problems, such as coefficient estimation in polynomial regression and direction-of-arrival (DOA) estimation, model selection is performed prior to estimation. In these cases, it is assumed that the true measurement model belongs to a set of candidate models. The data-based model selection step affects the subsequent estimation and may result in biased estimation. In particular, the oracle Cramer-Rao bound (CRB), which assumes knowledge of the model, is inappropriate for post-model-selection performance analysis and system design outside the asymptotic region. In this paper, we analyze the performance of post-model-selection estimators using the mean-squared-selected-error (MSSE) criterion. We assume coherent estimators that force unselected parameters to zero, and we introduce the concept of selective unbiasedness in the sense of Lehmann unbiasedness. We derive a non-Bayesian Cramer-Rao-type bound on the MSSE and on the mean-squared error (MSE) of any coherent and selective unbiased estimator. As an important special case, we illustrate the computation and applicability of the proposed selective CRB for sparse vector estimation, in which selecting a model is equivalent to recovering the support. Finally, we demonstrate in numerical simulations that the proposed selective CRB is a valid lower bound on the performance of the post-model-selection maximum likelihood estimator for the general linear model with different model selection criteria, and for sparse vector estimation with one-step thresholding. It is shown that in these cases the selective CRB outperforms the existing bounds: the oracle CRB, the averaged CRB, and the SMS-CRB from [1].
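To make the setting concrete, the following is a minimal Python sketch (not the authors' code) of the sparse-vector special case described above: Gaussian linear measurements y = Ax + w, support selection by one-step thresholding (keeping the k largest correlations |A^T y|), and the coherent post-selection maximum likelihood (least-squares) estimate on the selected support, with zeros elsewhere. Its empirical MSE is compared against the oracle CRB sigma^2 * tr((A_S^T A_S)^{-1}), which assumes the true support S is known. The measurement matrix, sparsity level k, noise level sigma, and trial count are illustrative assumptions.

```python
# Hypothetical sketch: post-model-selection ML estimation of a sparse vector
# via one-step thresholding, compared with the oracle CRB (known support).
import numpy as np

rng = np.random.default_rng(0)

n, m, k = 100, 20, 3      # measurements, vector dimension, sparsity (assumed values)
sigma = 0.5               # noise standard deviation (assumed)

A = rng.standard_normal((n, m)) / np.sqrt(n)   # measurement matrix
x = np.zeros(m)
x[:k] = 2.0                                    # true sparse vector, support {0, 1, 2}

trials = 2000
sq_err = 0.0
for _ in range(trials):
    y = A @ x + sigma * rng.standard_normal(n)
    # One-step thresholding: select the k columns with the largest correlations.
    support = np.argsort(np.abs(A.T @ y))[-k:]
    # Coherent post-selection ML estimate: least squares on the selected support,
    # unselected parameters forced to zero.
    x_hat = np.zeros(m)
    x_hat[support] = np.linalg.lstsq(A[:, support], y, rcond=None)[0]
    sq_err += np.sum((x_hat - x) ** 2)

mse = sq_err / trials

# Oracle CRB for the true support S: sigma^2 * trace((A_S^T A_S)^{-1}).
A_S = A[:, :k]
oracle_crb = sigma**2 * np.trace(np.linalg.inv(A_S.T @ A_S))

print(f"empirical MSE of post-selection ML: {mse:.4f}")
print(f"oracle CRB (support known):         {oracle_crb:.4f}")
```

At low SNR, selection errors bias the post-selection estimator and the oracle CRB no longer characterizes its performance, which is the gap the selective CRB of the paper is meant to close; this sketch only reproduces the comparison setup, not the selective bound itself.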
