Distributionally Robust Formulation and Model Selection for the Graphical Lasso

05/22/2019
by Pedro Cisneros-Velarde, et al.

Building on a recent framework for distributionally robust optimization in machine learning, we develop a similar framework for estimation of the inverse covariance matrix for multivariate data. We provide a novel notion of a Wasserstein ambiguity set specifically tailored to this estimation problem, from which we obtain a representation for a tractable class of regularized estimators. Special cases include penalized likelihood estimators for Gaussian data, specifically the graphical lasso estimator. As a consequence of this formulation, a natural relationship arises between the radius of the Wasserstein ambiguity set and the regularization parameter in the estimation problem. Using this relationship, one can directly control the level of robustness of the estimation procedure by specifying a desired level of confidence with which the ambiguity set contains a distribution with the true population covariance. Furthermore, a unique feature of our formulation is that the radius can be expressed in closed-form as a function of the ordinary sample covariance matrix. Taking advantage of this finding, we develop a simple algorithm to determine a regularization parameter for graphical lasso, using only the bootstrapped sample covariance matrices, meaning that computationally expensive repeated evaluation of the graphical lasso algorithm is not necessary. Alternatively, the distributionally robust formulation can also quantify the robustness of the corresponding estimator if one uses an off-the-shelf method such as cross-validation. Finally, we numerically study the obtained regularization criterion and analyze the robustness of other automated tuning procedures used in practice.
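The bootstrap-based tuning procedure described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's actual criterion: the entrywise max-norm deviation between bootstrapped and full-sample covariances and the quantile rule are stand-ins for the closed-form radius, and the function name `bootstrap_glasso_alpha` is hypothetical.

```python
import numpy as np

def bootstrap_glasso_alpha(X, n_boot=200, confidence=0.9, seed=None):
    """Choose a graphical-lasso regularization level from bootstrapped
    sample covariance matrices alone, without refitting the estimator.

    The deviation measure (entrywise max-norm between each bootstrap
    covariance and the full-sample covariance) and the quantile rule
    are illustrative assumptions, not the paper's closed-form radius.
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    S = np.cov(X, rowvar=False)            # ordinary sample covariance
    devs = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)   # resample rows with replacement
        S_b = np.cov(X[idx], rowvar=False)
        devs[b] = np.max(np.abs(S_b - S))  # deviation of this bootstrap covariance
    # Pick the radius that covers the bootstrap deviations with the
    # requested confidence, and use it as the regularization parameter.
    return float(np.quantile(devs, confidence))

# Example: tune alpha for a downstream graphical lasso solver
# (e.g., sklearn.covariance.graphical_lasso).
X = np.random.default_rng(0).normal(size=(200, 5))
alpha = bootstrap_glasso_alpha(X, seed=0)
```

Because only sample covariances are bootstrapped, the cost is one covariance computation per resample rather than one full graphical lasso fit per candidate parameter.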


Related research

10/20/2021
GGLasso – a Python package for General Graphical Lasso computation
We introduce GGLasso, a Python package for solving General Graphical Las...

09/04/2009
Tuning parameter selection for penalized likelihood estimation of inverse covariance matrix
In a Gaussian graphical model, the conditional independence between two ...

11/11/2011
A note on the lack of symmetry in the graphical lasso
The graphical lasso (glasso) is a widely-used fast algorithm for estimat...

09/15/2022
The Influence Function of Graphical Lasso Estimators
The precision matrix that encodes conditional linear dependency relation...

08/29/2022
On the Lasso for Graphical Continuous Lyapunov Models
Graphical continuous Lyapunov models offer a new perspective on modeling...

08/18/2011
Exact covariance thresholding into connected components for large-scale Graphical Lasso
We consider the sparse inverse covariance regularization problem or grap...

07/28/2023
Robust and Resistant Regularized Covariance Matrices
We introduce a class of regularized M-estimators of multivariate scatter...
