Near-Optimal Active Learning of Halfspaces via Query Synthesis in the Noisy Setting

03/11/2016
by   Lin Chen, et al.

In this paper, we consider the problem of actively learning a linear classifier through query synthesis, where the learner can construct artificial queries in order to estimate the true decision boundary. This problem has recently attracted considerable interest in automated science and adversarial reverse engineering, applications for which only heuristic algorithms are known. In such applications, queries can be constructed de novo to elicit information (e.g., automated science) or to evade detection at minimal cost (e.g., adversarial reverse engineering). We develop a general framework, called dimension coupling (DC), that 1) reduces a d-dimensional learning problem to d-1 low-dimensional sub-problems, 2) solves each sub-problem efficiently, 3) appropriately aggregates the results and outputs a linear classifier, and 4) provides a theoretical guarantee for all possible aggregation schemes. The proposed method is provably resilient to noise. We show that the DC framework avoids the curse of dimensionality: its computational complexity scales linearly with the dimension. Moreover, we show that the query complexity of DC is near-optimal (within a constant factor of the optimal algorithm). To further support our theoretical analysis, we compare the performance of DC with existing methods. We observe that DC consistently outperforms prior art in query complexity while often running orders of magnitude faster.
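The four-step reduce/solve/aggregate pipeline described above can be illustrated with a minimal, noiseless sketch (the paper's DC algorithm additionally tolerates label noise and analyzes general aggregation schemes; all names below are hypothetical). The sketch couples the first coordinate with each of the remaining d-1 coordinates, binary-searches for the label flip on the unit circle of each coordinate pair, and aggregates the recovered coordinate ratios into a unit normal vector:

```python
import numpy as np

def dc_learn_halfspace(oracle, d, tol=1e-9):
    """Dimension-coupling sketch for a homogeneous halfspace sign(<w, x>).

    Couples coordinate 0 with each coordinate i = 1..d-1 (d-1 sub-problems),
    binary-searches the label flip on the unit circle of the (0, i) plane,
    and aggregates the recovered ratios w[i]/w[0] into a unit normal.
    Illustrative assumptions: noiseless labels and w[0] != 0.
    """
    def query_point(i, theta):
        # Synthesize an artificial query on the unit circle of the (0, i) plane.
        p = np.zeros(d)
        p[0], p[i] = np.cos(theta), np.sin(theta)
        return p

    ratios = np.ones(d)                     # ratios[i] will hold w[i] / w[0]
    sign_w0 = oracle(query_point(1, 0.0))   # label of e_0 reveals sign(w[0])
    for i in range(1, d):
        # sign(w[0]*cos(t) + w[i]*sin(t)) flips exactly once on [0, pi),
        # and the labels at t = 0 and t = pi are opposite, so we can bisect.
        lo, hi = 0.0, np.pi
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            if oracle(query_point(i, mid)) == sign_w0:
                lo = mid
            else:
                hi = mid
        theta = 0.5 * (lo + hi)
        # At the boundary, w[0]*cos(theta) + w[i]*sin(theta) = 0.
        ratios[i] = -np.cos(theta) / np.sin(theta)

    w_hat = sign_w0 * ratios                # restore the sign of w[0]
    return w_hat / np.linalg.norm(w_hat)
```

Each sub-problem uses O(log(1/tol)) synthesized queries and the sub-problems are independent, so the total work grows linearly with the dimension, matching the scaling behavior the abstract claims for DC; the noise resilience of the actual algorithm (e.g., via repeated queries) is not reproduced here.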


