Consistent Non-Parametric Methods for Adaptive Robustness

02/18/2021
by Robi Bhattacharjee, et al.

Learning classifiers that are robust to adversarial examples has received a great deal of recent attention. A major drawback of the standard robust learning framework is that it imposes a single, artificial robustness radius r on all inputs, ignoring the fact that data may be highly heterogeneous. In this paper, we address this limitation by proposing a new framework for adaptive robustness, called neighborhood preserving robustness. We present sufficient conditions under which general non-parametric methods that can be represented as weight functions satisfy our notion of robustness, and show that both nearest neighbors and kernel classifiers satisfy these conditions in the large sample limit.
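The abstract does not spell out the weight-function representation it refers to, but the idea is standard in non-parametric classification: the prediction at a point x is a weighted vote over the training labels, with weights determined by how x relates to the training points. The sketch below is only an illustration of that general form under assumed conventions (labels in {-1, +1}; the function names, the choice of k, and the Gaussian bandwidth are hypothetical and not taken from the paper); it is not the authors' construction or their robustness analysis.

    import numpy as np

    def knn_weights(x, X_train, k=3):
        """Uniform 1/k weights on the k training points nearest to x."""
        dists = np.linalg.norm(X_train - x, axis=1)
        w = np.zeros(len(X_train))
        w[np.argsort(dists)[:k]] = 1.0 / k
        return w

    def kernel_weights(x, X_train, bandwidth=1.0):
        """Gaussian-kernel weights, normalized to sum to one."""
        dists = np.linalg.norm(X_train - x, axis=1)
        w = np.exp(-(dists / bandwidth) ** 2)
        return w / w.sum()

    def weight_function_classifier(x, X_train, y_train, weight_fn):
        """Predict the sign of the weighted vote over {-1, +1} labels."""
        w = weight_fn(x, X_train)
        return np.sign(w @ y_train)

    # Tiny usage example with made-up data: both weight schemes
    # plug into the same voting rule.
    X_train = np.array([[0.0, 0.0], [1.0, 1.0], [0.9, 1.1], [0.1, -0.2]])
    y_train = np.array([-1, 1, 1, -1])
    x = np.array([0.8, 0.9])
    print(weight_function_classifier(x, X_train, y_train, knn_weights))     # +1
    print(weight_function_classifier(x, X_train, y_train, kernel_weights))  # +1

Both nearest-neighbor and kernel classifiers are instances of this weighted-vote form, which is why the paper can state sufficient conditions once for general weight functions and then verify them for each method.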
