Optimal classification in sparse Gaussian graphic model

12/21/2012
by Yingying Fan, et al.

Consider a two-class classification problem where the number of features is much larger than the sample size. The features are masked by Gaussian noise with mean zero and covariance matrix Σ, where the precision matrix Ω=Σ^-1 is unknown but is presumably sparse. The useful features, also unknown, are sparse and each contributes weakly (i.e., rare and weak) to the classification decision. By obtaining a reasonably good estimate of Ω, we formulate the setting as a linear regression model. We propose a two-stage classification method where we first select features by the method of Innovated Thresholding (IT), and then use the retained features and Fisher's LDA for classification. In this approach, a crucial problem is how to set the threshold of IT. We approach this problem by adapting the recent innovation of Higher Criticism Thresholding (HCT). We find that when useful features are rare and weak, the limiting behavior of HCT is essentially just as good as the limiting behavior of the ideal threshold, the threshold one would choose if the underlying distribution of the signals were known (if only). Somewhat surprisingly, when Ω is sufficiently sparse, its off-diagonal coordinates usually do not have a major influence over the classification decision. Compared to recent work in the case where Ω is the identity matrix [Proc. Natl. Acad. Sci. USA 105 (2008) 14790-14795; Philos. Trans. R. Soc. Lond. Ser. A Math. Phys. Eng. Sci. 367 (2009) 4449-4470], the current setting is much more general, which calls for a new approach and much more sophisticated analysis. One key component of the analysis is the intimate relationship between HCT and Fisher's separation. Another key component is the tight large-deviation bounds for empirical processes for data with unconventional correlation structures, where graph theory on vertex coloring plays an important role.
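The two-stage pipeline in the abstract — innovated transform of the feature-wise z-scores by an estimate of Ω, Higher Criticism Thresholding to pick the cutoff, then a linear (LDA-type) classification rule on the retained features — can be sketched as follows. This is an illustrative simplification, not the paper's exact procedure: the function names (`hc_threshold`, `train_it_hct`, `classify`), the ±1 class coding, and the restriction of the HC search to p-values in [1/p, 1/2] (as in the standard HCT recipe) are assumptions made for the sketch; a practical version would also plug in a sparse precision-matrix estimate for `Omega_hat` rather than take it as given.

```python
import numpy as np
from scipy.stats import norm


def hc_threshold(z):
    """Higher Criticism threshold for a vector of (approximate) z-scores.

    Maximizes the HC objective over the smallest half of the p-values,
    ignoring p-values below 1/p, and returns the |z| cutoff at the maximizer.
    """
    p = len(z)
    pvals = np.sort(2 * norm.sf(np.abs(z)))         # ascending two-sided p-values
    j = np.arange(1, p + 1)
    hc = np.sqrt(p) * (j / p - pvals) / np.sqrt(pvals * (1 - pvals) + 1e-12)
    keep = (pvals >= 1.0 / p) & (j <= p // 2)       # standard HCT search range
    jstar = int(np.argmax(np.where(keep, hc, -np.inf)))
    # ascending p-values pair with descending |z|, so index into sorted |z|
    return np.sort(np.abs(z))[::-1][jstar]


def train_it_hct(X, y, Omega_hat):
    """Two-stage rule: innovated transform + HCT feature selection.

    X: (n, p) data matrix; y: labels in {-1, +1}; Omega_hat: estimated precision.
    Returns a sparse weight vector w for the linear rule sign(w @ x).
    """
    n = len(y)
    Z = (X * y[:, None]).sum(axis=0) / np.sqrt(n)   # feature-wise z-score vector
    Zt = Omega_hat @ Z                               # innovated transform
    Zt = Zt / np.sqrt(np.diag(Omega_hat))            # standardize coordinate variances
    t = hc_threshold(Zt)
    return np.where(np.abs(Zt) >= t, Zt, 0.0)        # keep features above HCT


def classify(w, x_new):
    """Linear classification rule on the retained features."""
    return np.sign(w @ x_new)
```

On simulated rare-and-weak data (sparse mean shift, Σ = I so that Ω = I), the rule keeps only the coordinates whose innovated z-scores clear the data-driven HC threshold and classifies well above chance, matching the abstract's claim that the threshold can be set without knowing the signal distribution.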
