Consistency of Fractional Graph-Laplacian Regularization in Semi-Supervised Learning with Finite Labels
Laplace learning is a popular machine learning algorithm for inferring missing labels from a small number of labelled feature vectors using the geometry of a graph. More precisely, Laplace learning is based on minimising a graph Dirichlet energy, equivalently a discrete Sobolev H^1 semi-norm, constrained to take the values of known labels on a given subset. The variational problem is asymptotically ill-posed as the number of unlabelled feature vectors goes to infinity while the number of given labels stays finite, due to a lack of regularity in minimisers of the continuum Dirichlet energy in any dimension higher than one. In particular, continuum minimisers are not continuous. One solution is to consider higher-order regularisation, which is the analogue of minimising Sobolev H^s semi-norms. In this paper we consider the asymptotics of minimising a graph variant of the Sobolev H^s semi-norm with pointwise constraints. We show that, as expected, one needs s > d/2, where d is the dimension of the data manifold. We also show that there must be an upper bound on the connectivity of the graph; that is, highly connected graphs lead to degenerate behaviour of the minimiser even when s > d/2.
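For concreteness, the following is a minimal sketch of the optimisation described above, not code from the paper: it builds a Gaussian-weighted graph on toy 2-D data, forms a spectral fractional power L^s of the graph Laplacian, and minimises the quadratic form <u, L^s u> subject to pointwise label constraints via the first-order conditions. The bandwidth eps, the exponent s, the toy data, and the solver choice are all illustrative assumptions.

    # Sketch of fractional graph-Laplacian semi-supervised learning.
    # Assumptions (not from the paper): Gaussian weights with bandwidth
    # eps, a spectral fractional power L^s, and toy 2-D data with d = 2.
    import numpy as np
    from scipy.spatial.distance import cdist

    rng = np.random.default_rng(0)
    X = rng.uniform(size=(200, 2))      # n feature vectors, d = 2
    labelled = np.array([0, 1])         # indices of the known labels
    g = np.array([-1.0, 1.0])           # their label values

    # Gaussian-weighted graph; eps controls the connectivity (bandwidth).
    eps = 0.15
    W = np.exp(-cdist(X, X) ** 2 / eps ** 2)
    np.fill_diagonal(W, 0.0)
    L = np.diag(W.sum(axis=1)) - W      # unnormalised graph Laplacian

    # Fractional power L^s via the spectral decomposition; here s > d/2 = 1.
    s = 1.5
    lam, V = np.linalg.eigh(L)
    Ls = V @ np.diag(np.clip(lam, 0.0, None) ** s) @ V.T

    # Minimise <u, L^s u> subject to u = g on the labelled set by solving
    # the first-order conditions on the unlabelled nodes.
    mask = np.ones(len(X), dtype=bool)
    mask[labelled] = False
    u = np.empty(len(X))
    u[labelled] = g
    u[mask] = np.linalg.solve(Ls[np.ix_(mask, mask)],
                              -Ls[np.ix_(mask, labelled)] @ g)

Under these assumptions, taking s = 1 recovers standard Laplace learning, while larger s penalises rougher label functions; the paper's results concern the regime in which such minimisers remain well behaved as n grows.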