Large-scale analysis of generalization error in learning using margin-based classification methods
Large-margin classifiers are a popular family of classification methods. We derive an asymptotic expression for the generalization error of this family in the limit where both the sample size n and the dimension p go to ∞ with a fixed ratio α = n/p. The family covers a broad range of commonly used classifiers, including the support vector machine, distance-weighted discrimination, and penalized logistic regression. Our result can be used to establish the phase-transition boundary for the separability of the two classes. We assume that the data are generated from a single multivariate Gaussian distribution with an arbitrary covariance structure, and we explore two special choices of the covariance matrix: the spiked population model and two-layer neural networks with random first-layer weights. The closed-form expression is derived with the replica method from statistical physics. Our asymptotic results already match simulations when n and p are on the order of a few hundred. For two-layer neural networks, we reproduce the recently observed 'double descent' phenomenology for several classification models. We also discuss some statistical insights that can be drawn from these analyses.
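For orientation, margin-based classifiers of this kind are conventionally written as a penalized empirical risk over the margins y_i x_iᵀw; the formulation below is the standard one and is offered as an assumption about the setup, not a quotation of the paper's definition:

```latex
\hat{w} \;=\; \arg\min_{w \in \mathbb{R}^p}\;
  \frac{1}{n}\sum_{i=1}^{n} \ell\!\left(y_i\, x_i^{\top} w\right)
  \;+\; \frac{\lambda}{2}\,\|w\|_2^2
```

Taking the margin loss \ell(u) = \max(0, 1-u) recovers the soft-margin support vector machine, \ell(u) = \log(1 + e^{-u}) gives penalized logistic regression, and a corresponding margin loss gives distance-weighted discrimination.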
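As a sanity check on the regime described above, the following minimal sketch (an illustration, not the paper's code) Monte Carlo estimates the test error of one member of the family, L2-penalized logistic regression, on two-class Gaussian data with identity covariance as α = n/p varies; the dimension p, the class-mean direction, and the penalty strength are all illustrative assumptions.

```python
# Minimal simulation sketch: test error of L2-penalized logistic regression
# on two-class Gaussian data as alpha = n/p varies, with p "a few hundred".
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
p = 200                                   # dimension (illustrative choice)
mu = rng.standard_normal(p) / np.sqrt(p)  # hypothetical class-mean direction

def sample(n):
    """Draw labels y = ±1 and features x ~ N(y * mu, I_p)."""
    y = rng.choice([-1.0, 1.0], size=n)
    X = y[:, None] * mu + rng.standard_normal((n, p))
    return X, y

def test_error(alpha, n_test=20_000):
    """Fit on n = alpha * p training points; estimate error on held-out data."""
    X, y = sample(int(alpha * p))
    clf = LogisticRegression(C=1.0, max_iter=5_000).fit(X, y)
    Xt, yt = sample(n_test)
    return np.mean(clf.predict(Xt) != yt)

for alpha in (0.5, 1.0, 2.0, 4.0):
    print(f"alpha = {alpha:.1f}  test error ≈ {test_error(alpha):.3f}")
```

In a study of this kind, one would compare such empirical error estimates at each α against the closed-form replica prediction.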