Finite sample performance of linear least squares estimation

10/12/2018
by Michael Krikheli, et al.

Linear least squares is a well-known parameter estimation technique that is used even when it is sub-optimal, because of its very low computational cost and because it does not require exact knowledge of the noise statistics. Surprisingly, bounding the probability of large errors with finitely many samples has remained an open problem, especially when dealing with correlated noise with unknown covariance. In this paper we analyze the finite sample performance of the linear least squares estimator. Our analysis method is simple and relies on L_∞-type bounds on the estimation error; using these bounds, we obtain accurate bounds on the tail of the estimator's distribution. We show fast exponential convergence with the number of samples, quantifying how many samples are required to ensure a given accuracy with high probability. We analyze a sub-Gaussian setting with a fixed or random design matrix of the linear least squares problem, and extend the results to the case of a martingale difference noise sequence. We also provide probabilistic finite sample bounds on the L_2 norm of the estimation error. The tightness of the bounds is tested through simulation, and we demonstrate that our results are tighter than previously proposed bounds on the L_∞ norm of the error. The proposed bounds make it possible to predict the number of samples required for least squares estimation even when least squares is sub-optimal and is used for computational simplicity.
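The setting studied in the paper can be illustrated with a small Monte Carlo sketch: fit y = Xβ + noise by ordinary least squares and estimate the tail probability P(‖β̂ − β‖_∞ > ε) empirically for different sample sizes. This is a minimal illustration of the quantity being bounded, not the paper's bounds themselves; the dimension, noise model, and ε below are arbitrary choices for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def lls_linf_errors(n, d=3, trials=200):
    """Monte Carlo draws of the L_inf estimation error of least squares.

    Each trial draws a random Gaussian design matrix X (a sub-Gaussian
    design), Gaussian noise, solves the least squares problem, and
    records the L_inf norm of the estimation error.
    """
    beta = np.ones(d)                              # true parameter vector
    errs = np.empty(trials)
    for t in range(trials):
        X = rng.standard_normal((n, d))            # random design matrix
        y = X @ beta + rng.standard_normal(n)      # linear model with noise
        beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
        errs[t] = np.max(np.abs(beta_hat - beta))  # L_inf error
    return errs

# Empirical tail probability P(||beta_hat - beta||_inf > eps):
eps = 0.5
for n in (20, 200):
    tail = np.mean(lls_linf_errors(n) > eps)
    print(f"n={n}: empirical P(error > {eps}) = {tail:.3f}")
```

As expected, the empirical tail probability drops off sharply as the number of samples grows, which is the behavior the paper's finite sample bounds quantify.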

