Robust learning and complexity dependent bounds for regularized problems

02/06/2019
by Geoffrey Chinot, et al.

We obtain risk bounds for Regularized Empirical Risk Minimizers (RERM) and minmax Median-Of-Means (MOM) estimators where the regularization function ϕ(·) is not necessarily a norm. This covers, for example, the Support Vector Machine (SVM) and Elastic Net procedures. We obtain bounds on the L_2 estimation error rate that depend on the complexity of the true model F^* = { f ∈ F : ϕ(f) ≤ ϕ(f^*) }, where f^* is the minimizer of the risk over the class F. The estimators are based on loss functions that are both Lipschitz and convex. Results for RERM are derived without assumptions on the outputs and under subgaussian assumptions on the design. Similar results are shown for minmax MOM estimators in a close setting where outliers may be present in the dataset and where the design is only assumed to satisfy moment conditions, relaxing the subgaussian and i.i.d. hypotheses required for RERM. Unlike alternative estimators based on the MOM principle, the analysis of minmax MOM estimators does not rely on the small ball assumption (SBA) of Vladimir Koltchinskii and Shahar Mendelson in "Bounding the smallest singular value of a random matrix without concentration", but on a weak local Bernstein assumption.
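For context, the two procedures can be sketched using the standard formulations from the minmax MOM literature; the symbols λ (regularization parameter), N (sample size), and K (number of blocks) are assumptions here, and the paper's exact definitions may differ. RERM minimizes a regularized empirical risk,

    f̂_RERM ∈ argmin_{f ∈ F} ( (1/N) ∑_{i=1}^N ℓ(f(X_i), Y_i) + λ ϕ(f) ),

while the minmax MOM estimator replaces the empirical mean by a median of block means and solves a minmax problem,

    f̂_MOM ∈ argmin_{f ∈ F} sup_{g ∈ F} ( MOM_K(ℓ_f − ℓ_g) + λ (ϕ(f) − ϕ(g)) ),

where MOM_K(h) denotes the median of the K empirical means of h computed over an equipartition of the data into K blocks. The median step is what makes the estimator insensitive to a limited number of outlying observations.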
