Learning rates for partially linear support vector machine in high dimensions
This paper analyzes a new regularized learning scheme for high-dimensional partially linear support vector machines. The proposed approach combines the empirical risk with a Lasso-type penalty on the linear part and a standard functional norm penalty on the nonlinear part. The linear kernel is used for model interpretation and feature selection, while the nonlinear kernel is adopted to enhance algorithmic flexibility. We develop a new technical analysis of the weighted empirical process and establish sharp learning rates for the semi-parametric estimator under regularity conditions. In particular, the derived learning rates for the semi-parametric SVM depend not only on the sample size and the functional complexity, but also on the sparsity and the margin parameters.
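For concreteness, a penalized estimator of the kind described above can plausibly be written in the following form; the exact formulation, notation, and choice of loss are given in the full paper, so the display below is only an illustrative assumption:

\[
(\hat{\beta}, \hat{g}) \in \arg\min_{\beta \in \mathbb{R}^{p},\; g \in \mathcal{H}_{K}}
\; \frac{1}{n} \sum_{i=1}^{n} \bigl( 1 - y_i \, ( x_i^{\top} \beta + g(z_i) ) \bigr)_{+}
\; + \; \lambda_1 \| \beta \|_{1} \; + \; \lambda_2 \| g \|_{\mathcal{H}_{K}}^{2},
\]

where the hinge-loss average is the empirical risk, the \(\ell_1\) penalty encourages sparsity in the linear coefficients (enabling feature selection), and the reproducing kernel Hilbert space norm controls the complexity of the nonparametric component.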