
An efficient ADMM algorithm for high dimensional precision matrix estimation via penalized quadratic loss

11/12/2018
by Cheng Wang, et al.

The estimation of high-dimensional precision matrices has been a central topic in statistical learning. However, because the number of parameters scales quadratically with the dimension p, many state-of-the-art methods do not scale well to problems with very large p. In this paper, we propose a highly efficient algorithm for precision matrix estimation via penalized quadratic loss functions. In the high-dimension, low-sample-size setting, the computational complexity of our algorithm is linear in both the sample size and the number of parameters, the same order as computing the sample covariance matrix.
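The abstract does not spell out the objective or the ADMM updates. A commonly used penalized quadratic loss for a precision matrix Omega with sample covariance S is 0.5*tr(Omega S Omega) - tr(Omega) + lam*||Omega||_1 on the off-diagonal entries, and the sketch below is my assumption of that setup, not the paper's code. It uses a textbook ADMM split Omega = Z; the Omega-update is a Sylvester-type equation solved through one eigendecomposition of S, so this naive version costs O(p^3) rather than the cost the abstract describes (linear in n and in the p^2 parameters, the same as forming S), which presumably exploits the low rank of S when n < p.

import numpy as np

def soft_threshold(A, kappa):
    # Element-wise soft-thresholding operator.
    return np.sign(A) * np.maximum(np.abs(A) - kappa, 0.0)

def admm_precision(S, lam, rho=1.0, max_iter=500, tol=1e-6):
    # Minimize 0.5*tr(Omega S Omega) - tr(Omega) + lam*||off-diag(Omega)||_1
    # via ADMM with the split Omega = Z (a generic sketch, not the paper's algorithm).
    p = S.shape[0]
    evals, V = np.linalg.eigh(S)                      # S = V diag(evals) V^T
    denom = 0.5 * (evals[:, None] + evals[None, :]) + rho

    Z = np.eye(p)
    U = np.zeros((p, p))
    for _ in range(max_iter):
        # Omega-update: solve 0.5*(S Omega + Omega S) + rho*Omega = I + rho*(Z - U),
        # a Sylvester equation that becomes diagonal in the eigenbasis of S.
        B = np.eye(p) + rho * (Z - U)
        Omega = V @ ((V.T @ B @ V) / denom) @ V.T

        # Z-update: soft-threshold the off-diagonal entries, leave the diagonal unpenalized.
        Z_old = Z
        M = Omega + U
        Z = soft_threshold(M, lam / rho)
        np.fill_diagonal(Z, np.diag(M))

        # Dual update and a simple residual-based stopping rule.
        U = U + (Omega - Z)
        if np.linalg.norm(Omega - Z) < tol * p and np.linalg.norm(Z - Z_old) < tol * p:
            break
    return Z

# Hypothetical usage: X is an n x p data matrix, lam the l1 tuning parameter.
# S = np.cov(X, rowvar=False); Omega_hat = admm_precision(S, lam=0.1)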


Related research

06/25/2021  MARS: A second-order reduction algorithm for high-dimensional sparse precision matrices estimation
Estimation of the precision matrix (or inverse covariance matrix) is of ...

01/22/2019  A Fast Iterative Algorithm for High-dimensional Differential Network
Differential network is an important tool to capture the changes of cond...

01/22/2019  Penalized Interaction Estimation for Ultrahigh Dimensional Quadratic Regression
Quadratic regression goes beyond the linear model by simultaneously incl...

06/18/2012  Adaptive Regularization for Weight Matrices
Algorithms for learning distributions over weight-vectors, such as AROW ...

12/07/2021  Mesh-Based Solutions for Nonparametric Penalized Regression
It is often of interest to estimate regression functions non-parametrica...

05/21/2008  Kendall's tau in high-dimensional genomic parsimony
High-dimensional data models, often with low sample size, abound in many...

03/07/2021  Multilevel approximation of Gaussian random fields: Covariance compression, estimation and spatial prediction
Centered Gaussian random fields (GRFs) indexed by compacta such as smoot...