Target-Independent Active Learning via Distribution-Splitting

09/28/2018
by   Xiaofeng Cao, et al.
To reduce the label complexity of the Agnostic Active Learning (A^2) algorithm, volume-splitting splits hypothesis edges to reduce the Vapnik-Chervonenkis (VC) dimension of the version space. However, the effectiveness of volume-splitting depends critically on the initial hypothesis, a problem known as the target-dependent label complexity gap. This paper attempts to minimize this gap by introducing a novel notion of number density, which describes the hypothesis distribution more naturally and directly than volume. By uncovering the connections between the hypothesis and the input distribution, we map the volume of the version space to number density and propose a target-independent distribution-splitting strategy with the following advantages: 1) it provides theoretical guarantees on reducing label complexity and error rate comparable to volume-splitting; 2) it breaks the curse of the initial hypothesis; 3) it provides model guidance for a target-independent AL algorithm in real AL tasks. With these guarantees, we then split the input distribution into near-optimal spheres and develop an application algorithm called Distribution-based A^2 (DA^2). Experiments further verify the effectiveness of the halving and querying abilities of DA^2.
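The abstract does not spell out how DA^2 performs its splitting and querying. As a rough, hedged illustration of the general "split the input distribution into spheres, then query within each sphere" pattern it describes, the sketch below uses greedy k-center partitioning and a per-sphere uncertainty (margin) query; both choices are stand-in assumptions for illustration, not the paper's actual procedure.

```python
import math


def dist(a, b):
    """Euclidean distance between two points given as tuples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))


def split_into_spheres(points, k):
    """Partition the unlabeled pool into k spheres via greedy k-center.

    This is only a stand-in for the paper's near-optimal sphere
    splitting of the input distribution: each point is assigned to the
    sphere around its nearest center.
    """
    centers = [points[0]]  # deterministic seed center for illustration
    while len(centers) < k:
        # Next center: the point farthest from all current centers.
        far = max(points, key=lambda p: min(dist(p, c) for c in centers))
        centers.append(far)
    spheres = {i: [] for i in range(k)}
    for p in points:
        i = min(range(k), key=lambda j: dist(p, centers[j]))
        spheres[i].append(p)
    return centers, spheres


def query_per_sphere(spheres, margin):
    """Within each non-empty sphere, query the point with the smallest
    margin (i.e., the most uncertain point) under the current model."""
    return [min(pts, key=margin) for pts in spheres.values() if pts]
```

For example, with two well-separated clusters and a margin function measuring distance to a hypothetical decision boundary at x = 5, the loop queries one most-uncertain point from each sphere; the queried labels would then drive the halving step in an A^2-style algorithm.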
