Efficient active learning of sparse halfspaces with arbitrary bounded noise

02/12/2020
by Chicheng Zhang, et al.

In this work we study active learning of homogeneous s-sparse halfspaces in R^d under label noise. Even in the absence of label noise this is a challenging problem, and only recently have label complexity bounds of the form Õ(s · polylog(d, 1/ϵ)) been established in <cit.> for computationally efficient algorithms under the broad class of isotropic log-concave distributions. In contrast, under high levels of label noise, the label complexity bounds achieved by computationally efficient algorithms are much worse. When the label noise satisfies the Massart condition <cit.>, i.e., each label is flipped with probability at most η for a parameter η ∈ [0, 1/2), the work of <cit.> provides a computationally efficient active learning algorithm under isotropic log-concave distributions with label complexity Õ(s^poly(1/(1-2η)) · poly(log d, 1/ϵ)). Hence the algorithm is label-efficient only when the noise rate η is a constant. In this work, we substantially improve on the state of the art by designing a polynomial-time algorithm for active learning of s-sparse halfspaces under bounded noise and isotropic log-concave distributions, with a label complexity of Õ(s/(1-2η)^4 · polylog(d, 1/ϵ)). Hence our new algorithm is label-efficient even for noise rates close to 1/2. Prior to our work, such a result was not known even for the simpler random classification noise model. Our algorithm builds upon the existing margin-based algorithmic framework: in each iteration it performs a sequence of online mirror descent updates on a carefully chosen sequence of loss functions, using a novel gradient update rule that accounts for the bounded noise.
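To make the algorithmic recipe above concrete, here is a minimal, hypothetical sketch in Python of a margin-based active learner with online mirror descent updates and a sparsity-enforcing thresholding step. It assumes an isotropic Gaussian unlabeled distribution and a Massart-noise label oracle; the ℓ_p mirror map, the hinge-style surrogate loss, the band schedule, and the learning rates are illustrative stand-ins, not the paper's carefully chosen loss sequence or its noise-aware gradient update rule.

```python
# Illustrative sketch of a margin-based active learner for sparse halfspaces
# under bounded (Massart) noise. This is NOT the paper's exact algorithm:
# the l_p mirror map, hinge-style surrogate loss, band schedule, and
# hard-thresholding step below are standard stand-ins for the paper's
# carefully chosen loss sequence and noise-aware gradient update.
import numpy as np

def mirror_descent_step(w, grad, lr, p):
    """One online mirror descent step with mirror map Phi(w) = 0.5*||w||_p^2,
    a common choice for sparse problems since p near 1 mimics l_1 geometry."""
    q = p / (p - 1.0)  # dual exponent, 1/p + 1/q = 1
    # Map to the dual space: theta = grad Phi(w).
    theta = np.sign(w) * np.abs(w) ** (p - 1) * np.linalg.norm(w, p) ** (2 - p)
    theta -= lr * grad  # gradient step in the dual
    # Map back via the conjugate: w = grad Phi*(theta).
    return np.sign(theta) * np.abs(theta) ** (q - 1) * np.linalg.norm(theta, q) ** (2 - q)

def hard_threshold(w, s):
    """Keep only the s largest-magnitude coordinates (enforces s-sparsity;
    a simple proxy for a sparsity-inducing constraint set)."""
    out = np.zeros_like(w)
    top = np.argsort(-np.abs(w))[:s]
    out[top] = w[top]
    return out

def active_learn(oracle, d, s, rounds=6, queries_per_round=200, seed=0):
    rng = np.random.default_rng(seed)
    p = 1.0 + 1.0 / np.log(d)            # mirror-map exponent suited to sparsity
    w = hard_threshold(rng.standard_normal(d), s)
    w /= np.linalg.norm(w)
    for k in range(rounds):
        band = 2.0 ** (-k)               # shrinking margin band around the boundary
        lr = band / np.sqrt(queries_per_round)
        queried = 0
        while queried < queries_per_round:
            x = rng.standard_normal(d)   # isotropic Gaussian, a log-concave example
            if abs(x @ w) > band:        # only query labels near the current boundary
                continue
            y = oracle(x)                # noisy label in {-1, +1}
            queried += 1
            if y * (x @ w) < band:       # hinge-type loss is active inside the band
                w = mirror_descent_step(w, -y * x, lr, p)
                w /= np.linalg.norm(w)   # renormalize (simplifies the projection step)
        w = hard_threshold(w, s)         # re-sparsify between rounds
        w /= np.linalg.norm(w)
    return w

# Toy usage: an s-sparse target whose labels are flipped with probability eta.
d, s, eta = 500, 5, 0.3
rng = np.random.default_rng(1)
w_star = hard_threshold(rng.standard_normal(d), s)
w_star /= np.linalg.norm(w_star)
oracle = lambda x: np.sign(x @ w_star) * (1 if rng.random() >= eta else -1)
w_hat = active_learn(oracle, d, s)
print("angle to target:", np.arccos(np.clip(w_hat @ w_star, -1.0, 1.0)))
```

The ℓ_p mirror map with p = 1 + 1/log(d) is a standard device for obtaining polylog(d) dependence in sparse settings, which is why it appears here in place of plain gradient descent; the shrinking query band is the hallmark of the margin-based framework the abstract refers to.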
