Double Data Piling for Heterogeneous Covariance Models

11/28/2022
by Taehyun Kim, et al.

In this work, we characterize two data piling phenomena for high-dimensional binary classification under heterogeneous covariance models. Data piling refers to the phenomenon in which the projections of the training data onto a direction vector take exactly two distinct values, one for each class. The first data piling phenomenon occurs for any data whenever the dimension p exceeds the sample size n. We show that a second data piling phenomenon, namely the piling of independent test data, can occur in an asymptotic regime where p grows while n is fixed. We further show that a second maximal data piling direction, which asymptotically maximizes the distance between the two piles of independent test data, can be obtained by projecting the first maximal data piling direction onto the nullspace of the common leading eigenspace. This observation provides a theoretical explanation for why the optimal ridge parameter can be negative in high-dimensional linear classification. Based on the second data piling phenomenon, we propose linear classification rules that achieve perfect classification of high-dimension, low-sample-size data under generalized heterogeneous spiked covariance models.
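The first data piling phenomenon described above is easy to reproduce numerically. The sketch below (a minimal illustration, not the authors' construction; the sample sizes, means, and variances are arbitrary choices) generates two Gaussian classes with p much larger than n and unequal covariance scales, then projects the class-mean difference onto the null space of the within-class centered data matrix. Any direction in that null space annihilates the within-class variation, so each class collapses onto a single projected value:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n1, n2 = 200, 10, 10  # p >> n1 + n2: the HDLSS regime (illustrative sizes)

# Two classes with shifted means and heterogeneous covariance scales
X1 = rng.normal(0.0, 1.0, (n1, p))
X2 = rng.normal(1.0, 2.0, (n2, p))

# Within-class centered data; its row space has dimension at most n1 + n2 - 2 < p,
# so its null space is nontrivial
W = np.vstack([X1 - X1.mean(axis=0), X2 - X2.mean(axis=0)])

# Project the mean-difference direction onto the null space of W:
# pinv(W) @ W is the orthogonal projection onto the row space of W,
# so w satisfies W @ w = 0 and kills all within-class variation
d = X1.mean(axis=0) - X2.mean(axis=0)
w = d - np.linalg.pinv(W) @ (W @ d)
w /= np.linalg.norm(w)

# Each class now projects to a single value: two distinct piles
s1, s2 = X1 @ w, X2 @ w
print(np.ptp(s1), np.ptp(s2))      # ~0 up to floating-point error
print(abs(s1.mean() - s2.mean()))  # positive gap between the two piles
```

With n1 + n2 = 20 and p = 200, the spread within each projected class is at machine precision while the two piles remain separated, which is exactly the p > n piling behavior the abstract refers to.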


