Learning When Training Data are Costly: The Effect of Class Distribution on Tree Induction

06/22/2011
by F. Provost, et al.

For large, real-world inductive learning problems, the number of training examples often must be limited due to the costs associated with procuring, preparing, and storing the training examples and/or the computational costs associated with learning from them. In such circumstances, one question of practical importance is: if only n training examples can be selected, in what proportion should the classes be represented? In this article we help to answer this question by analyzing, for a fixed training-set size, the relationship between the class distribution of the training data and the performance of classification trees induced from these data. We study twenty-six data sets and, for each, determine the best class distribution for learning. The naturally occurring class distribution is shown to generally perform well when classifier performance is evaluated using undifferentiated error rate (0/1 loss). However, when the area under the ROC curve is used to evaluate classifier performance, a balanced distribution is shown to perform well. Since neither of these choices for class distribution always generates the best-performing classifier, we introduce a budget-sensitive progressive sampling algorithm for selecting training examples based on the class associated with each example. An empirical analysis of this algorithm shows that the class distribution of the resulting training set yields classifiers with good (nearly-optimal) classification performance.
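The abstract does not spell out the mechanics of the budget-sensitive progressive sampling algorithm, but its core idea — spend a fixed example budget incrementally, directing each purchase toward whichever class has recently improved classifier performance more — can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual procedure: the function name, the fixed batch size, the greedy last-gain heuristic, and the `eval_fn` callback are all assumptions introduced here for clarity.

```python
def budget_sensitive_sampling(pos_pool, neg_pool, budget, eval_fn, batch=8):
    """Hedged sketch of budget-sensitive progressive sampling by class.

    Starts from a small balanced seed, then repeatedly spends `batch`
    examples on whichever class most improved the score last time.
    eval_fn(sample) -> score (higher is better, e.g. AUC of a tree
    induced from the sample); pos_pool/neg_pool are lists of examples.
    """
    half = batch // 2
    sample = pos_pool[:half] + neg_pool[:half]       # balanced seed
    used = {"pos": half, "neg": half}                # examples bought so far
    last_gain = {"pos": 0.0, "neg": 0.0}             # marginal benefit per class
    score = eval_fn(sample)

    while used["pos"] + used["neg"] + batch <= budget:
        # Greedily buy from the class whose last purchase helped more.
        cls = "pos" if last_gain["pos"] >= last_gain["neg"] else "neg"
        pool = pos_pool if cls == "pos" else neg_pool
        new = pool[used[cls]:used[cls] + batch]
        if not new:                                   # pool exhausted:
            cls = "neg" if cls == "pos" else "pos"    # fall back to other class
            pool = pos_pool if cls == "pos" else neg_pool
            new = pool[used[cls]:used[cls] + batch]
            if not new:
                break
        sample += new
        used[cls] += len(new)
        new_score = eval_fn(sample)
        last_gain[cls] = new_score - score
        score = new_score
    return sample, used
```

In practice `eval_fn` would induce a classification tree from the sample and score it on a validation set (by AUC or error rate, per the paper's two evaluation regimes); the sketch leaves that pluggable so the sampling logic stays self-contained.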

