Optimal Rates for Spectral-regularized Algorithms with Least-Squares Regression over Hilbert Spaces

01/20/2018
by   Junhong Lin, et al.

In this paper, we study regression problems over a separable Hilbert space with the square loss, covering non-parametric regression over a reproducing kernel Hilbert space. We investigate a class of spectral-regularized algorithms, including ridge regression, principal component analysis, and gradient methods. We prove optimal, high-probability convergence results in a variety of norms for the studied algorithms, under a capacity assumption on the hypothesis space and a general source condition on the target function. As a consequence, we obtain almost-sure convergence results with optimal rates. Our results improve and generalize previous results, filling a theoretical gap for the non-attainable cases.
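The algorithms the abstract names (ridge regression, principal component analysis, gradient methods) share a common structure: each applies a filter function to the eigenvalues of the empirical kernel (or covariance) operator. As a minimal NumPy sketch of this spectral-filtering view, not the paper's code, the ridge filter g_lam(sigma) = 1/(sigma + lam) recovers the usual closed-form kernel ridge solution; the Gaussian kernel and the value of lam below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
X = rng.uniform(-1, 1, size=n)
y = np.sin(np.pi * X) + 0.1 * rng.standard_normal(n)

# Illustrative Gaussian kernel matrix (kernel choice is an assumption)
K = np.exp(-(X[:, None] - X[None, :]) ** 2 / 0.5)

lam = 1e-2  # regularization parameter (illustrative value)

# Eigendecomposition of the symmetric normalized kernel matrix K/n
evals, evecs = np.linalg.eigh(K / n)

# Ridge regression as a spectral filter: g_lam(sigma) = 1 / (sigma + lam)
g = 1.0 / (evals + lam)
alpha_spectral = evecs @ (g * (evecs.T @ y)) / n

# Direct solve of the same linear system, for comparison
alpha_direct = np.linalg.solve(K / n + lam * np.eye(n), y) / n

assert np.allclose(alpha_spectral, alpha_direct)
```

Other spectral algorithms in the class differ only in the choice of filter: principal component regression truncates small eigenvalues, and gradient descent corresponds to a polynomial filter whose degree grows with the number of iterations.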
