Continuum Limit of Lipschitz Learning on Graphs

12/07/2020
by   Tim Roith, et al.

Tackling semi-supervised learning problems with graph-based methods has become a trend in recent years, since graphs can represent all kinds of data and provide a suitable framework for studying continuum limits, e.g., of differential operators. A popular strategy here is p-Laplacian learning, which imposes a smoothness condition on the inference function sought on the set of unlabeled data. For p<∞, continuum limits of this approach were studied using tools from Γ-convergence. For the case p=∞, which is referred to as Lipschitz learning, continuum limits of the related infinity-Laplacian equation were studied using the concept of viscosity solutions. In this work, we prove continuum limits of Lipschitz learning using Γ-convergence. In particular, we define a sequence of functionals which approximate the largest local Lipschitz constant of a graph function and prove Γ-convergence in the L^∞-topology to the supremum norm of the gradient as the graph becomes denser. Furthermore, we show compactness of the functionals, which implies convergence of minimizers. In our analysis we allow a varying set of labeled data which converges to a general closed set in the Hausdorff distance. We apply our results to nonlinear ground states and, as a by-product, prove convergence of graph distance functions to geodesic distance functions.
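
To illustrate the kind of discrete object the abstract refers to, here is a minimal sketch (not the paper's exact construction) of a difference-quotient functional on a random geometric graph: points within distance eps are connected, and the functional returns the largest local Lipschitz quotient of a graph function u. The function name, the scaling, and the example data are illustrative assumptions, not the authors' definitions.

```python
import numpy as np

def max_local_lipschitz(points, u, eps):
    """Largest local Lipschitz quotient of a graph function u on a
    geometric graph: vertices are `points`, edges connect pairs within
    distance eps, and we return max |u(x) - u(y)| / |x - y| over edges.
    (Illustrative form only; the paper's weights/rescaling may differ.)"""
    n = len(points)
    best = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            d = np.linalg.norm(points[i] - points[j])
            if 0.0 < d <= eps:
                best = max(best, abs(u[i] - u[j]) / d)
    return best

# Usage sketch: as the sample gets denser (n grows, eps shrinks slowly),
# this quantity approximates the supremum norm of the gradient of a
# smooth function, here u(x) = x_1 on the unit square.
rng = np.random.default_rng(0)
pts = rng.random((500, 2))
vals = pts[:, 0]
print(max_local_lipschitz(pts, vals, eps=0.1))  # close to 1 = sup |grad u|
```

The continuum-limit statement in the paper concerns functionals of this flavor: as the graph becomes denser, their Γ-limit in the L^∞-topology is the supremum norm of the gradient.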


Related research

- Uniform Convergence Rates for Lipschitz Learning on Graphs (11/24/2021)
  Lipschitz learning is a graph-based semi-supervised learning method wher...
- Ratio convergence rates for Euclidean first-passage percolation: Applications to the graph infinity Laplacian (10/17/2022)
  In this paper we prove the first quantitative convergence rates for the ...
- A continuum limit for the PageRank algorithm (01/24/2020)
  Semi-supervised and unsupervised machine learning methods often rely on ...
- Stochastic subgradient method converges on tame functions (04/20/2018)
  This work considers the question: what convergence guarantees does the s...
- Large Data and Zero Noise Limits of Graph-Based Semi-Supervised Learning Algorithms (05/23/2018)
  Scalings in which the graph Laplacian approaches a differential operator...
- On Consistency of Graph-based Semi-supervised Learning (03/17/2017)
  Graph-based semi-supervised learning is one of the most popular methods ...
- Enhancing Mixup-based Semi-Supervised Learning with Explicit Lipschitz Regularization (09/23/2020)
  The success of deep learning relies on the availability of large-scale a...
