Phase Transitions for High-Dimensional Quadratic Discriminant Analysis with Rare and Weak Signals

08/24/2021
by Wanjie Wang, et al.

Consider a two-class classification problem where we observe samples (X_i, Y_i) for i = 1, ..., n, with X_i ∈ ℛ^p and Y_i ∈ {0, 1}. Given Y_i = k, X_i is assumed to follow a multivariate normal distribution with mean μ_k ∈ ℛ^p and covariance matrix Σ_k, k = 0, 1. Given a new sample X from the same mixture, our goal is to estimate its class label Y. The difficulty lies in the rarity and weakness of the differences in the mean vectors and in the covariance matrices. By incorporating the quadratic terms Ω_k = Σ_k^-1 from the two classes, we formulate the likelihood-based classification as a Quadratic Discriminant Analysis (QDA) problem, and accordingly propose a QDA classification method with a feature-selection step. Compared with recent work on the linear case (LDA), where Ω_k is assumed to be the same across classes, the current setting is much more general. We set up a rare and weak model for both the mean vector and the precision matrix. In terms of the model parameters, we clearly depict the boundary separating the region of successful classification from the region of unsuccessful classification for the newly proposed QDA method with feature selection, for the two cases where μ_k is known or unknown. We also explore the region of successful classification of the QDA approach when both μ_k and Ω_k are unknown. The results suggest that, relative to LDA, the quadratic term has a major influence on both the classification decision and the classification accuracy. Numerical results on real datasets support our theory and demonstrate the necessity and superiority of QDA over LDA for classification under the rare and weak model.
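To make the likelihood-based QDA rule concrete, the following minimal sketch (in Python with NumPy) classifies a new sample by the sign of the Gaussian log-likelihood ratio built from estimated means μ_k and precision matrices Ω_k = Σ_k^-1, assuming equal class priors. The function names fit_qda and predict_qda and the mean-difference threshold select_threshold are hypothetical illustrations, not the authors' exact feature-selection procedure.

```python
# Minimal sketch of two-class QDA with a simple (hypothetical) feature-selection
# step: keep coordinates whose between-class mean gap exceeds a threshold, then
# classify by the Gaussian log-likelihood ratio with class-specific precisions.
import numpy as np


def fit_qda(X, y, select_threshold=0.0):
    """Estimate class means and precision matrices on selected features."""
    X0, X1 = X[y == 0], X[y == 1]
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    # Hypothetical feature selection: keep coordinates with a large mean gap.
    keep = np.abs(mu1 - mu0) > select_threshold
    X0, X1 = X0[:, keep], X1[:, keep]
    # Precision matrices Omega_k = Sigma_k^{-1} (pseudo-inverse for stability).
    omega0 = np.linalg.pinv(np.cov(X0, rowvar=False))
    omega1 = np.linalg.pinv(np.cov(X1, rowvar=False))
    return dict(keep=keep, mu=(mu0[keep], mu1[keep]), omega=(omega0, omega1))


def predict_qda(model, X):
    """Assign class 1 when the quadratic discriminant score is positive."""
    mu0, mu1 = model["mu"]
    omega0, omega1 = model["omega"]
    Xs = X[:, model["keep"]]
    d0, d1 = Xs - mu0, Xs - mu1
    # Quadratic forms (x - mu_k)^T Omega_k (x - mu_k), one per sample.
    q0 = np.einsum("ij,jk,ik->i", d0, omega0, d0)
    q1 = np.einsum("ij,jk,ik->i", d1, omega1, d1)
    _, logdet0 = np.linalg.slogdet(omega0)
    _, logdet1 = np.linalg.slogdet(omega1)
    # Log-likelihood ratio under equal priors; the log-determinant terms are
    # exactly the contribution that LDA (common covariance) would drop.
    score = 0.5 * (q0 - q1) + 0.5 * (logdet1 - logdet0)
    return (score > 0).astype(int)
```

Setting select_threshold to zero keeps all features; increasing it mimics the rare-and-weak regime, where only a few coordinates carry usable signal.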

Related research:
- High-Dimensional Quadratic Discriminant Analysis under Spiked Covariance Model (06/25/2020)
- Optimal classification in sparse Gaussian graphic model (12/21/2012)
- Discriminant Analysis with Adaptively Pooled Covariance (11/07/2011)
- High-dimensional quadratic classifiers in non-sparse settings (03/16/2015)
- Phase Transitions for High Dimensional Clustering and Related Problems (02/24/2015)
- High Dimensional Classification for Spatially Dependent Data with Application to Neuroimaging (05/03/2020)
- A note on the geometry of the MAP partition in some Normal Bayesian Mixture Models (02/04/2019)
