Rates of Convergence for Laplacian Semi-Supervised Learning with Low Labeling Rates

06/04/2020
by Jeff Calder, et al.

We study graph-based Laplacian semi-supervised learning at low labeling rates. Laplacian learning uses harmonic extension on a graph to propagate labels. At very low label rates, Laplacian learning becomes degenerate and the solution is roughly constant with spikes at each labeled data point. Previous work has shown that this degeneracy occurs when the number of labeled data points is finite while the number of unlabeled data points tends to infinity. In this work we allow the number of labeled data points to grow to infinity with the number of unlabeled data points. Our results show that for a random geometric graph with length scale ε>0 and labeling rate β>0, if β≪ε^2 then the solution becomes degenerate and spikes form, and if β≫ε^2 then Laplacian learning is well-posed and consistent with a continuum Laplace equation. Furthermore, in the well-posed setting we prove quantitative error estimates of O(εβ^(-1/2)), up to logarithmic factors, for the difference between the solutions of the discrete problem and the continuum PDE. We also study p-Laplacian regularization and show the same degeneracy result when β≪ε^p. The proofs of our well-posedness results use the random walk interpretation of Laplacian learning and PDE arguments, while the proofs of the ill-posedness results use Γ-convergence tools from the calculus of variations. We also present numerical results on synthetic and real data to illustrate our findings.
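
To make the setup concrete, the following is a minimal, illustrative sketch (not the authors' code) of Laplacian learning as harmonic extension on a random geometric graph. The unit-square point cloud, the Gaussian weight kernel, and the specific values of n, ε, and β below are assumptions chosen only to show the roles these parameters play.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative choices (assumptions, not from the paper's experiments):
# beta >> eps^2 here, i.e. the well-posed regime described in the abstract.
n, eps, beta = 1000, 0.1, 0.05
X = rng.uniform(0, 1, size=(n, 2))        # random points in the unit square

# Random geometric graph with length scale eps: Gaussian weights,
# truncated at graph distance 2*eps.
D2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-D2 / (2 * eps**2)) * (D2 < (2 * eps) ** 2)
np.fill_diagonal(W, 0)
L = np.diag(W.sum(1)) - W                  # unnormalized graph Laplacian

# Label a beta-fraction of the points with a smooth ground-truth function g.
g = np.sin(2 * np.pi * X[:, 0])
labeled = rng.random(n) < beta
I, B = ~labeled, labeled

# Laplacian learning = harmonic extension:
# solve L u = 0 on unlabeled nodes subject to u = g on labeled nodes,
# i.e. L_II u_I = -L_IB g_B.
u = np.empty(n)
u[B] = g[B]
u[I] = np.linalg.solve(L[np.ix_(I, I)], -L[np.ix_(I, B)] @ g[B])

print("max error vs. ground truth:", np.abs(u - g).max())
```

In this toy setting, pushing β well below ε^2 (so that only a handful of points carry labels) tends to reproduce the degenerate behavior described in the abstract, with u nearly constant away from spikes at the labeled points.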

