Indirect Gaussian Graph Learning beyond Gaussianity

10/08/2016
by Yiyuan She, et al.

This paper studies how to capture dependency graph structures from real data that may not be multivariate Gaussian. Starting from marginal loss functions that are not necessarily derived from probability distributions, we use an additive over-parametrization with shrinkage to incorporate variable dependencies into the criterion. An iterative Gaussian graph learning algorithm that is easy to implement is proposed. Statistical analysis shows that, with the error measured by a proper Bregman divergence, the estimators achieve a fast rate of convergence. Real-life examples in different settings demonstrate the efficacy of the proposed methodology.
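To make the setting concrete, the sketch below estimates a sparse dependency graph from non-Gaussian data using a standard rank-based Gaussianization followed by an l1-shrunk precision-matrix fit. This is not the paper's additive over-parametrization algorithm, only an illustration of the general task; the `gaussianize` helper and the regularization level `alpha=0.1` are assumptions made for the example.

```python
# Illustrative sketch only: a rank-based (nonparanormal-style) transform
# followed by an l1-penalized Gaussian graph fit. It illustrates learning a
# sparse dependency graph from data whose marginals need not be Gaussian;
# it is NOT the algorithm proposed in the paper.
import numpy as np
from scipy.stats import norm, rankdata
from sklearn.covariance import GraphicalLasso

def gaussianize(X):
    """Map each column to approximate normal scores via its ranks."""
    n = X.shape[0]
    U = (np.apply_along_axis(rankdata, 0, X) - 0.5) / n
    return norm.ppf(U)

rng = np.random.default_rng(0)
X = rng.exponential(size=(200, 10))          # non-Gaussian toy data
Z = gaussianize(X)                           # marginally Gaussianized scores
model = GraphicalLasso(alpha=0.1).fit(Z)     # sparse precision-matrix estimate
graph = np.abs(model.precision_) > 1e-6      # adjacency pattern of the learned graph
print(graph.astype(int))
```

In practice the shrinkage level would be tuned (e.g., by cross-validation) rather than fixed at 0.1; the example only shows the estimate-a-graph-from-non-Gaussian-data workflow that the paper addresses in a more general loss-function framework.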


