Distributionally Robust Inverse Covariance Estimation: The Wasserstein Shrinkage Estimator

05/18/2018
by Viet Anh Nguyen et al.

We introduce a distributionally robust maximum likelihood estimation model with a Wasserstein ambiguity set to infer the inverse covariance matrix of a p-dimensional Gaussian random vector from n independent samples. The proposed model minimizes the worst case (maximum) of Stein's loss across all normal reference distributions within a prescribed Wasserstein distance from the normal distribution characterized by the sample mean and the sample covariance matrix. We prove that this estimation problem is equivalent to a semidefinite program that is tractable in theory but beyond the reach of general purpose solvers for practically relevant problem dimensions p. In the absence of any prior structural information, the estimation problem has an analytical solution that is naturally interpreted as a nonlinear shrinkage estimator. Besides being invertible and well-conditioned even for p>n, the new shrinkage estimator is rotation-equivariant and preserves the order of the eigenvalues of the sample covariance matrix. These desirable properties are not imposed ad hoc but emerge naturally from the underlying distributionally robust optimization model. Finally, we develop a sequential quadratic approximation algorithm for efficiently solving the general estimation problem subject to conditional independence constraints typically encountered in Gaussian graphical models.
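The abstract's closed-form estimator is not reproduced here, but its structural properties can be illustrated with a generic rotation-equivariant shrinkage sketch. The snippet below is *not* the paper's Wasserstein estimator: it shrinks the sample covariance eigenvalues linearly toward their grand mean (a Ledoit-Wolf-style target, chosen purely for illustration) before inverting. Because the eigenvalue map is strictly increasing and bounded away from zero, the resulting precision estimate is invertible and well-conditioned even for p > n, is rotation-equivariant (it acts only on eigenvalues), and preserves the eigenvalue ordering of the sample covariance, mirroring the properties the abstract highlights. The function name and the shrinkage weight `rho` are assumptions for this sketch.

```python
import numpy as np

def shrinkage_precision(X, rho=0.5):
    """Illustrative rotation-equivariant shrinkage precision estimator.

    NOT the paper's closed-form Wasserstein shrinkage estimator; this
    sketch only mimics its structural properties: the eigenvalues of
    the sample covariance are pushed toward their mean by a monotone
    map (so their order is preserved) and are strictly positive after
    shrinkage (so the inverse exists even when p > n).
    """
    S = np.cov(X, rowvar=False, bias=True)        # sample covariance (p x p)
    lam, U = np.linalg.eigh(S)                    # eigenvalues in ascending order
    target = lam.mean()                           # grand-mean shrinkage target
    lam_shrunk = (1 - rho) * lam + rho * target   # monotone map: order preserved
    # Invert the shrunk eigenvalues; safe because lam_shrunk > 0 a.s.
    return U @ np.diag(1.0 / lam_shrunk) @ U.T

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 50))                 # n = 20 samples, p = 50 > n
P = shrinkage_precision(X, rho=0.5)               # finite despite singular S
```

With n = 20 and p = 50 the sample covariance is singular, yet the shrunk eigenvalues stay strictly positive, so `P` is a valid symmetric positive-definite precision matrix; the paper's estimator achieves this through the Wasserstein-robust optimization rather than an ad hoc target.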
