
Ledoit-Wolf linear shrinkage with unknown mean

04/14/2023
by Benoît Oriol, et al.

This work addresses large-dimensional covariance matrix estimation with unknown mean. The empirical covariance estimator fails when the dimension and the number of samples are proportional and tend to infinity, a setting known as Kolmogorov asymptotics. When the mean is known, Ledoit and Wolf (2004) proposed a linear shrinkage estimator and proved its convergence under these asymptotics. To the best of our knowledge, no formal proof has been given when the mean is unknown. To address this issue, we propose a new estimator and prove its quadratic convergence under the assumptions of Ledoit and Wolf. Finally, we show empirically that it outperforms other standard estimators.
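To make the setting concrete, here is a minimal sketch of the known-mean Ledoit-Wolf (2004) linear shrinkage estimator, naively adapted to the unknown-mean case by centering with the sample mean. This is only an illustration of the shrinkage idea the abstract refers to; the corrected estimator proposed in the paper, and its finite-sample adjustment for the estimated mean, may differ.

```python
import numpy as np

def lw_linear_shrinkage(X):
    """Ledoit-Wolf (2004) linear shrinkage toward a scaled identity,
    naively adapted to unknown mean by centering with the sample mean.
    Norms follow Ledoit & Wolf: squared Frobenius norm divided by p."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)               # center: the "unknown mean" step
    S = Xc.T @ Xc / n                     # empirical covariance
    m = np.trace(S) / p                   # target scale: normalized trace <S, I>
    d2 = np.sum((S - m * np.eye(p))**2) / p          # dispersion of S around m*I
    b2_bar = sum(np.sum((np.outer(x, x) - S)**2)     # estimated error of S
                 for x in Xc) / (n**2 * p)
    b2 = min(b2_bar, d2)
    a2 = d2 - b2
    # Convex combination of the target m*I and the sample covariance S
    return (b2 / d2) * m * np.eye(p) + (a2 / d2) * S

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 200))        # n << p: S is singular, shrinkage is not
Sigma_hat = lw_linear_shrinkage(X)
```

Because the estimate is a strictly positive multiple of the identity plus a positive-semidefinite matrix, it is invertible even when n < p, which is exactly where the empirical covariance breaks down.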


Related research

03/22/2021 · Optimal Linear Classification via Eigenvalue Shrinkage: The Case of Additive Noise
In this paper, we consider the general problem of testing the mean of tw...

08/30/2018 · Optimal shrinkage covariance matrix estimation under random sampling from elliptical distributions
This paper considers the problem of estimating a high-dimensional (HD) c...

06/17/2020 · Shrinking the eigenvalues of M-estimators of covariance matrix
A highly popular regularized (shrinkage) covariance matrix estimator is ...

04/25/2019 · Nonparametric Estimation and Inference in Psychological and Economic Experiments
The goal of this paper is to provide some statistical tools for nonparam...

02/27/2023 · The Local Ledoit-Péché Law
Ledoit and Péché proved convergence of certain functions of a random cov...

07/03/2021 · Cleaning large-dimensional covariance matrices for correlated samples
A non-linear shrinkage estimator of large-dimensional covariance matrice...

04/07/2014 · Tyler's Covariance Matrix Estimator in Elliptical Models with Convex Structure
We address structured covariance estimation in elliptical distributions ...