Active and passive learning of linear separators under log-concave distributions

11/06/2012
by Maria-Florina Balcan, et al.

We provide new results concerning label-efficient, polynomial-time, passive and active learning of linear separators. We prove that active learning provides an exponential improvement over PAC (passive) learning of homogeneous linear separators under nearly log-concave distributions. Building on this, we provide a computationally efficient PAC algorithm with optimal (up to a constant factor) sample complexity for such problems. This resolves an open question concerning the sample complexity of efficient PAC algorithms under the uniform distribution in the unit ball. Moreover, it provides the first bound for a polynomial-time PAC algorithm that is tight for an interesting infinite class of hypothesis functions under a general and natural class of data distributions, marking significant progress towards a longstanding open question. We also provide new bounds for active and passive learning in the case that the data might not be linearly separable, both in the agnostic case and under the Tsybakov low-noise condition. To derive our results, we prove new structural results for (nearly) log-concave distributions, which may be of independent interest.

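To make the active-learning strategy concrete, below is a minimal NumPy sketch of margin-based active learning of a homogeneous linear separator: labels are requested only inside a band around the current decision boundary, and the band shrinks geometrically from round to round. The margin schedule (b0 and the halving rule), the sample sizes, and the hinge-loss subroutine band_erm are illustrative assumptions rather than the paper's tuned parameters and ERM step, and the isotropic Gaussian here stands in for a general (nearly) log-concave distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

def band_erm(X, y, w_init, steps=300, lr=0.05):
    """Subgradient descent on the average hinge loss, warm-started at the
    previous hypothesis -- a simple stand-in for the paper's ERM step."""
    w = w_init.copy()
    for _ in range(steps):
        viol = y * (X @ w) < 1.0              # margin violations
        if not viol.any():
            break
        w += lr * (y[viol, None] * X[viol]).mean(axis=0)
    return w / np.linalg.norm(w)

def margin_based_active_learner(oracle, d, rounds=5, m=500, b0=0.5):
    """Query labels only inside a geometrically shrinking band
    |w . x| <= b around the current decision boundary."""
    X0 = rng.standard_normal((m, d))          # round 0: random labeled sample
    w = band_erm(X0, oracle(X0), rng.standard_normal(d))
    b = b0
    for _ in range(rounds):
        pool = rng.standard_normal((int(3 * m / b), d))   # unlabeled pool
        B = pool[np.abs(pool @ w) <= b][:m]   # uncertainty band: query these
        if len(B) == 0:
            break
        w = band_erm(B, oracle(B), w)         # refit on band points only
        b /= 2.0                              # halve the band each round
    return w

# Demo: a hidden target w* labels the data; only banded points are queried.
d = 20
w_star = rng.standard_normal(d)
w_star /= np.linalg.norm(w_star)
oracle = lambda X: np.sign(X @ w_star)
w_hat = margin_based_active_learner(oracle, d)
print("angle to target (rad):", np.arccos(np.clip(w_hat @ w_star, -1.0, 1.0)))
```

The label savings come from the band: each round spends its label budget only where the current hypothesis is uncertain, which is what drives the exponential improvement in label complexity over passive learning in the separable, log-concave setting.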

Related research

03/22/2017
S-Concave Distributions: Towards Broader Distributions for Noise-Tolerant and Sample-Efficient Learning Algorithms
We provide new results concerning noise-tolerant and sample-efficient le...

07/31/2013
The Power of Localization for Efficiently Learning Linear Separators with Noise
We introduce a new approach for designing computationally efficient lear...

12/19/2020
On the Power of Localized Perceptron for Label-Optimal Learning of Halfspaces with Adversarial Noise
We study online active learning of homogeneous halfspaces in ℝ^d with ad...

02/12/2020
Efficient active learning of sparse halfspaces with arbitrary bounded noise
In this work we study active learning of homogeneous s-sparse halfspaces...

06/17/2022
Learning a Single Neuron with Adversarial Label Noise via Gradient Descent
We study the fundamental problem of learning a single neuron, i.e., a fu...

06/20/2014
Noise-adaptive Margin-based Active Learning and Lower Bounds under Tsybakov Noise Condition
We present a simple noise-robust margin-based active learning algorithm ...

07/30/2020
The Complexity of Adversarially Robust Proper Learning of Halfspaces with Agnostic Noise
We study the computational complexity of adversarially robust proper lea...
