Optimal rates for F-score binary classification

05/10/2019
by Evgenii Chzhen, et al.

We study the minimax setting of binary classification with the F-score under a β-smoothness assumption on the regression function η(x) = P(Y = 1 | X = x) for x ∈ R^d. We propose a classification procedure which, under an α-margin assumption, achieves the rate O(n^{-β(1+α)/(2β+d)}) for the excess F-score. In this context, the Bayes optimal classifier for the F-score is obtained by thresholding the aforementioned regression function η at some level θ^* which has to be estimated. The proposed procedure is performed in a semi-supervised manner: a labeled dataset of size n ∈ N is used to estimate the regression function, and an unlabeled dataset of size N ∈ N is used to estimate the optimal threshold θ^*. Interestingly, the value of N does not affect the rate of convergence, which indicates that it is "harder" to estimate the regression function η than the optimal threshold θ^*. This further implies that binary classification with the F-score behaves similarly to the standard setting of binary classification. Finally, we show that the rates achieved by the proposed procedure are optimal in the minimax sense up to a constant factor.
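
To illustrate the semi-supervised plug-in idea described above, here is a minimal sketch, not the authors' exact procedure: a hypothetical k-NN estimate η̂ is fit on the labeled sample of size n, and the threshold θ̂ is chosen on the unlabeled sample of size N by maximizing a plug-in estimate of the F-score built from η̂ alone. The toy data-generating function and all tuning choices (k, threshold grid) are illustrative assumptions.

```python
# Sketch of a plug-in F-score classifier (assumed setup, not the paper's exact estimator).
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(0)

# Labeled sample of size n for estimating eta(x) = P(Y = 1 | X = x); toy eta for illustration.
n, N, d = 2000, 5000, 2
X_lab = rng.uniform(-1, 1, size=(n, d))
eta_true = lambda X: 1.0 / (1.0 + np.exp(-3.0 * X[:, 0]))  # hypothetical ground truth
y_lab = rng.binomial(1, eta_true(X_lab))

# Unlabeled sample of size N for estimating the optimal threshold theta*.
X_unl = rng.uniform(-1, 1, size=(N, d))

# Step 1: nonparametric estimate of eta from the labeled data
# (k-NN regression is just one choice consistent with a smoothness assumption).
eta_hat = KNeighborsRegressor(n_neighbors=50).fit(X_lab, y_lab)
p_unl = np.clip(eta_hat.predict(X_unl), 0.0, 1.0)

# Step 2: pick the threshold by maximizing a plug-in F-score estimate that uses
# only eta_hat and the unlabeled points:
#   F(theta) ~= 2 E[eta 1{eta >= theta}] / (E[eta] + P(eta >= theta)).
def f_score_plugin(theta):
    keep = p_unl >= theta
    return 2.0 * np.mean(p_unl * keep) / (np.mean(p_unl) + np.mean(keep))

thetas = np.linspace(0.01, 0.99, 99)
theta_hat = thetas[np.argmax([f_score_plugin(t) for t in thetas])]

# Final classifier: threshold the estimated regression function at theta_hat.
classify = lambda X: (np.clip(eta_hat.predict(X), 0.0, 1.0) >= theta_hat).astype(int)
print(f"estimated threshold theta_hat = {theta_hat:.2f}")
```

In this sketch only the quality of η̂ (driven by n) limits the final classifier, mirroring the abstract's observation that the unlabeled sample size N does not affect the rate of convergence.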
