Estimation of positive definite M-matrices and structure learning for attractive Gaussian Markov random fields
Consider a random vector with finite second moments. If its precision matrix is an M-matrix, then all partial correlations are non-negative. If the random vector is additionally Gaussian, the corresponding Gaussian Markov random field (GMRF) is called attractive. We study the estimation of M-matrices in the role of inverse second-moment or precision matrices via sign-constrained log-determinant divergence minimization, including the high-dimensional case in which the number of variables exceeds the sample size. The additional sign constraints turn out to simplify the estimation problem considerably: we provide evidence that explicit regularization is no longer required. To solve the resulting convex optimization problem, we propose an algorithm based on block coordinate descent, in which each sub-problem can be recast as a non-negative least squares problem. Illustrations on both simulated and real-world data are provided.
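The following is a minimal sketch, not the authors' reference implementation, of what such a block coordinate descent could look like: the objective assumed is the sign-constrained log-determinant divergence -log det(Omega) + trace(S Omega) over positive definite matrices with non-positive off-diagonal entries, each row/column update is reduced to a non-negative least squares problem solved here with scipy.optimize.nnls, and all function and parameter names (estimate_mmatrix, max_sweeps, tol) are illustrative choices rather than the paper's.

```python
import numpy as np
from scipy.linalg import cholesky, solve_triangular
from scipy.optimize import nnls


def estimate_mmatrix(S, max_sweeps=100, tol=1e-6):
    """Sketch: minimize -log det(Omega) + trace(S @ Omega) over symmetric
    positive definite Omega whose off-diagonal entries are <= 0."""
    p = S.shape[0]
    Omega = np.diag(1.0 / np.diag(S))  # feasible, positive definite start
    for _ in range(max_sweeps):
        Omega_old = Omega.copy()
        for j in range(p):
            rest = np.array([k for k in range(p) if k != j])
            O11_inv = np.linalg.inv(Omega[np.ix_(rest, rest)])
            s12, s22 = S[rest, j], S[j, j]
            # Column sub-problem in x = -omega_12 >= 0:
            #   min_x  s22 * x' O11_inv x - 2 * s12' x,
            # recast as NNLS min_x ||A x - b||^2 with A'A = s22*O11_inv, A'b = s12.
            R = cholesky(O11_inv)                      # upper triangular, O11_inv = R' R
            A = np.sqrt(s22) * R
            b = solve_triangular(R, s12, trans='T') / np.sqrt(s22)
            x, _ = nnls(A, b)
            omega12 = -x
            Omega[rest, j] = Omega[j, rest] = omega12
            # Closed-form diagonal update keeps the Schur complement at 1/s22 > 0,
            # so the iterate stays positive definite.
            Omega[j, j] = 1.0 / s22 + omega12 @ O11_inv @ omega12
        if np.max(np.abs(Omega - Omega_old)) < tol:
            break
    return Omega


# Illustrative use on a singular second-moment matrix (p > n), reflecting the
# high-dimensional setting discussed above; no inverse of S itself is needed.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 30))          # n = 20 samples, p = 30 variables
Omega_hat = estimate_mmatrix(X.T @ X / 20)
```

Each sweep updates one row/column of Omega while holding the rest fixed; the only data the sub-problem touches are the corresponding row of S, which is why the sketch runs on a rank-deficient S as well.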