Error Analysis on Graph Laplacian Regularized Estimator

02/11/2019
by Kaige Yang, et al.

We provide a theoretical analysis of the representation learning problem aimed at learning the latent variables (design matrix) Θ of observations Y, given knowledge of the coefficient matrix X. The design matrix is learned under the assumption that the latent variables Θ are smooth with respect to a (known) topological structure G. To learn such latent variables, we study a graph Laplacian regularized estimator, i.e., the penalized least squares estimator whose penalty term is proportional to a Laplacian quadratic form. This type of estimator has recently received considerable attention due to its ability to incorporate the underlying topological graph structure of the variables into the learning process. While the estimation problem can be solved efficiently by state-of-the-art optimization techniques, its statistical consistency properties have been largely overlooked. In this work, we develop a non-asymptotic bound on the estimation error under the classical statistical setting, where the sample size is larger than the ambient dimension of the latent variables. The bound characterizes theoretically how the alignment between the data and the graph structure, as well as the graph spectrum, affects the estimation accuracy. It also provides theoretical evidence of the advantage, in terms of convergence rate, of the graph Laplacian regularized estimator over classical estimators that ignore the graph structure when a smoothness prior holds. Finally, we report empirical results on the estimation error that corroborate the theoretical analysis.
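
As a concrete illustration of the kind of estimator described above, the following is a minimal sketch, assuming a linear model Y ≈ XΘ with Y (n × p), X (n × d), Θ (d × p), a known graph Laplacian L (p × p) over the p latent variables, and penalty λ·tr(ΘLΘᵀ). The variable names, dimensions, and the use of a Sylvester solver are illustrative assumptions, not the paper's notation. Setting the gradient of ‖Y − XΘ‖_F² + λ·tr(ΘLΘᵀ) to zero gives the linear system XᵀXΘ + λΘL = XᵀY, which is a Sylvester equation.

```python
# Minimal sketch of a graph Laplacian regularized least-squares estimator
# (assumed setup, not necessarily the paper's exact formulation):
#   Y (n x p) observations, X (n x d) known coefficient matrix,
#   Theta (d x p) latent variables, L (p x p) graph Laplacian.
# Estimator: argmin_Theta ||Y - X @ Theta||_F^2 + lam * trace(Theta @ L @ Theta.T)
# First-order condition: (X.T @ X) @ Theta + lam * Theta @ L = X.T @ Y,
# i.e. a Sylvester equation A*Theta + Theta*B = Q.
import numpy as np
from scipy.linalg import solve_sylvester


def laplacian_regularized_ls(Y, X, L, lam):
    """Graph Laplacian regularized least-squares estimate of Theta (d x p)."""
    A = X.T @ X          # (d x d)
    B = lam * L          # (p x p)
    Q = X.T @ Y          # (d x p)
    return solve_sylvester(A, B, Q)


# Toy usage: a path graph on p = 5 nodes, with rows of Theta constant
# across the graph (maximally smooth) plus Gaussian noise.
rng = np.random.default_rng(0)
n, d, p = 200, 3, 5
X = rng.standard_normal((n, d))
W = np.diag(np.ones(p - 1), 1) + np.diag(np.ones(p - 1), -1)  # path adjacency
L = np.diag(W.sum(axis=1)) - W                                 # Laplacian L = D - W
Theta_true = rng.standard_normal((d, 1)) @ np.ones((1, p))     # smooth on the graph
Y = X @ Theta_true + 0.1 * rng.standard_normal((n, p))
Theta_hat = laplacian_regularized_ls(Y, X, L, lam=1.0)
print(np.linalg.norm(Theta_hat - Theta_true) / np.linalg.norm(Theta_true))
```

With lam = 0 this reduces to ordinary least squares; a larger lam shrinks the estimate toward signals that are smooth on G, which is the regime in which the paper's bound indicates an improved convergence rate over estimators that ignore the graph.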

