The Peaking Phenomenon in Semi-supervised Learning

10/17/2016
by Jesse H. Krijthe, et al.

For the supervised least squares classifier, when the number of training objects is smaller than the dimensionality of the data, adding more data to the training set may first increase the error rate before decreasing it. This possibly counterintuitive phenomenon is known as peaking. In this work, we observe that a similar but more pronounced version of this phenomenon also occurs in the semi-supervised setting, where unlabeled objects, rather than labeled ones, are added to the training set. Through simulation studies and an approximation of the learning curve based on the work of Raudys & Duin, we explain why the learning curve has a steeper incline and a more gradual decline in this setting.
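The supervised version of the peaking phenomenon is easy to reproduce in simulation. The sketch below is an illustrative toy experiment, not the paper's exact setup: the data model (two Gaussian classes in 50 dimensions with an assumed mean separation `delta`) and all parameter values are choices made here for demonstration. A least squares classifier is fit via the pseudo-inverse, and the test error is estimated as the number of labeled training objects n passes through the dimensionality d.

```python
# Illustrative sketch of peaking in the supervised least squares classifier.
# Test error can rise as the number of training objects n approaches the
# dimensionality d, then fall again as n grows well beyond d.
import numpy as np

rng = np.random.default_rng(0)
d = 50        # data dimensionality
delta = 0.3   # per-feature class-mean separation (assumed for this demo)

def sample(n):
    """Two Gaussian classes with means +/- delta per feature, labels +1/-1."""
    y = rng.choice([-1.0, 1.0], size=n)
    X = rng.standard_normal((n, d)) + delta * y[:, None]
    return X, y

def ls_error(n_train, n_test=2000):
    """Fit least squares to +/-1 targets and return the test error rate."""
    X, y = sample(n_train)
    # pinv gives the minimum-norm solution, which also handles n_train < d.
    w = np.linalg.pinv(X) @ y
    Xt, yt = sample(n_test)
    return np.mean(np.sign(Xt @ w) != yt)

results = {}
for n in (10, 25, 50, 100, 400):
    errs = [ls_error(n) for _ in range(20)]
    results[n] = float(np.mean(errs))
    print(f"n = {n:4d}  mean test error = {results[n]:.3f}")
```

With this setup the averaged learning curve typically peaks near n = d = 50, where the fitted solution interpolates the training noise, and drops again once n clearly exceeds d.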


