Gradient Descent in RKHS with Importance Labeling

06/19/2020
by Tomoya Murata, et al.

Labeling cost is often expensive and is a fundamental limitation of supervised learning. In this paper, we study the importance labeling problem: given many unlabeled data points, we select a limited number of them to be labeled, and a learning algorithm is then executed on the selected data. We propose a new importance labeling scheme and analyse the generalization error of gradient descent combined with our scheme for least squares regression in Reproducing Kernel Hilbert Spaces (RKHS). We show that the proposed importance labeling leads to much better generalization than uniform labeling in near-interpolation settings. Numerical experiments verify our theoretical findings.
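The pipeline described in the abstract can be sketched in a few lines: score the unlabeled pool by an importance measure, label only the selected points, and run gradient descent on the kernel least squares objective over those points. This is a minimal illustration, not the paper's method: the abstract does not specify the importance scheme, so ridge leverage scores are used here purely as a stand-in, and all names, constants, and the kernel choice are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a pool of unlabeled inputs; labels are expensive,
# so only n_label points are selected for labeling.
n_pool, n_label, n_steps = 200, 40, 500
X = rng.uniform(-1.0, 1.0, size=(n_pool, 1))
f_true = lambda x: np.sin(np.pi * x).ravel()

def rbf_kernel(A, B, gamma=5.0):
    """Gaussian RBF kernel matrix between rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

# Importance scores via ridge leverage scores of the pool kernel matrix.
# NOTE: this is an assumed stand-in; the paper's scheme may differ.
K_pool = rbf_kernel(X, X)
lam = 1e-3 * n_pool
lev = np.diag(K_pool @ np.linalg.inv(K_pool + lam * np.eye(n_pool)))
p = lev / lev.sum()

# Non-uniform ("importance") selection of the points to label.
idx = rng.choice(n_pool, size=n_label, replace=False, p=p)
Xs = X[idx]
# Near-interpolation regime: labels carry only tiny observation noise.
ys = f_true(Xs) + 0.01 * rng.normal(size=n_label)

# Gradient descent in the RKHS: parameterize f = K @ alpha and minimize
# (1/2n) * ||K @ alpha - y||^2 over the labeled points.
K = rbf_kernel(Xs, Xs)
lmax = np.linalg.eigvalsh(K)[-1]           # largest eigenvalue of K
lr = n_label / lmax**2                     # step size within the stable range
alpha = np.zeros(n_label)
for _ in range(n_steps):
    grad = K @ (K @ alpha - ys) / n_label
    alpha -= lr * grad

# Proxy for generalization error: mean squared error over the whole pool.
pred = rbf_kernel(X, Xs) @ alpha
mse = np.mean((pred - f_true(X)) ** 2)
```

Swapping `p` for the uniform distribution in the `rng.choice` call gives the uniform-labeling baseline the abstract compares against.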


