Model Mismatch Trade-offs in LMMSE Estimation

by Martin Hellkvist, et al.
Uppsala universitet

We consider a linear minimum mean squared error (LMMSE) estimation framework with model mismatch, where the assumed model order is smaller than that of the underlying linear system that generates the data used in the estimation process. By modelling the regressors of the underlying system as random variables, we analyze the average behaviour of the mean squared error (MSE). Our results quantify how the MSE depends on the interplay between the number of samples and the number of parameters in the underlying system and in the assumed model. In particular, if the number of samples is not sufficiently large, neither increasing the number of samples nor the assumed model complexity is sufficient to guarantee a performance improvement.
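The mismatch setting described above can be illustrated with a small simulation. The sketch below is a minimal, hypothetical setup (not the paper's exact model): data are generated by a linear system with `p_true` parameters, but the estimator only assumes the first `p_assumed` of them; the remaining, unmodelled parameters act as extra noise. The Gaussian priors, the noise variance `sigma2`, and the specific parameter values are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
p_true, p_assumed, sigma2 = 20, 10, 0.1  # assumed values for illustration

def mse_mismatched(n, trials=200):
    """Average MSE of an LMMSE-style estimate that only models
    the first p_assumed of the p_true underlying parameters."""
    errs = []
    for _ in range(trials):
        # Random regressors of the underlying system (illustrative Gaussian model)
        A = rng.normal(size=(n, p_true)) / np.sqrt(p_true)
        theta = rng.normal(size=p_true)          # true parameters, unit-variance prior
        y = A @ theta + rng.normal(scale=np.sqrt(sigma2), size=n)

        # Assumed (under-parameterized) model uses only p_assumed regressors
        Ak = A[:, :p_assumed]
        # LMMSE estimate under the assumed model: (Ak'Ak + sigma2*I)^{-1} Ak'y
        theta_hat = np.linalg.solve(
            Ak.T @ Ak + sigma2 * np.eye(p_assumed), Ak.T @ y
        )
        errs.append(np.mean((theta_hat - theta[:p_assumed]) ** 2))
    return float(np.mean(errs))

for n in (5, 15, 50):
    print(f"n = {n:3d}  average MSE = {mse_mismatched(n):.3f}")
```

Because the unmodelled parameters contribute an irreducible error term, the average MSE does not decay to the noise floor as `n` grows, echoing the trade-off the abstract describes.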




