Optimal Imperfect Classification for Gaussian Functional Data
Existing work on functional data classification focuses on constructing classifiers that achieve perfect classification, in the sense that the classification risk converges to zero asymptotically. In practice, perfect classification is often impossible, since the optimal Bayes classifier may have asymptotically nonzero risk; this phenomenon is called imperfect classification. For Gaussian functional data, we study the classification problem in the imperfect classification scenario. Sharp convergence rates for the minimax excess risk are derived when the data functions are either fully observed or discretely observed. We propose easily implementable classifiers based on discriminant analysis and prove that they achieve minimax optimality. In the discretely observed case, we identify a critical sampling frequency that governs the sharp convergence rates. The proposed classifiers perform favorably in finite samples, as we demonstrate through comparisons with other functional classifiers in simulations and a real data application.
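To illustrate the general flavor of a discriminant-analysis classifier for Gaussian functional data observed at discrete sampling points, the following is a minimal sketch, not the estimator analyzed in the paper: a quadratic-discriminant rule applied to curves recorded on a common grid, with a ridge-regularized covariance estimate. All names and parameters (e.g., `ridge`, the grid size, the synthetic data) are illustrative assumptions; the paper's actual classifier, regularization, and sampling-frequency analysis are given in the full text.

```python
# Sketch of a QDA-style classifier for Gaussian curves observed on a common grid.
# Illustrative only; the paper's proposed classifier may differ substantially.
import numpy as np

def fit_gaussian_qda(curves, labels, ridge=1e-3):
    """Estimate per-class means and ridge-regularized covariances.
    `curves` has shape (n_samples, n_grid); `ridge` is a hypothetical tuning constant."""
    params = {}
    for k in np.unique(labels):
        X = curves[labels == k]
        mean = X.mean(axis=0)
        cov = np.cov(X, rowvar=False) + ridge * np.eye(X.shape[1])
        _, logdet = np.linalg.slogdet(cov)
        params[k] = (mean, np.linalg.inv(cov), logdet, np.log(len(X) / len(curves)))
    return params

def predict_gaussian_qda(params, curves):
    """Assign each curve to the class with the largest Gaussian log-score."""
    keys, scores = [], []
    for k, (mean, prec, logdet, logprior) in params.items():
        diff = curves - mean
        quad = np.einsum('ij,jk,ik->i', diff, prec, diff)  # per-sample quadratic form
        keys.append(k)
        scores.append(-0.5 * (quad + logdet) + logprior)
    return np.array(keys)[np.argmax(np.vstack(scores), axis=0)]

# Toy example: two classes of Gaussian curves sampled at 25 grid points.
rng = np.random.default_rng(0)
grid = np.linspace(0.0, 1.0, 25)
X0 = rng.multivariate_normal(np.sin(2 * np.pi * grid), 0.1 * np.eye(25), size=100)
X1 = rng.multivariate_normal(np.sin(2 * np.pi * grid) + 0.3 * grid, 0.1 * np.eye(25), size=100)
X, y = np.vstack([X0, X1]), np.repeat([0, 1], 100)
model = fit_gaussian_qda(X, y)
print("training accuracy:", (predict_gaussian_qda(model, X) == y).mean())
```

In this sketch the ridge term plays the role of regularizing the discretized covariance operator, which becomes ill-conditioned as the sampling grid grows; the paper's theory instead characterizes how the sampling frequency itself governs the attainable convergence rate.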