Worst case recovery guarantees for least squares approximation using random samples
We consider a least squares regression algorithm for the recovery of complex-valued functions belonging to a reproducing kernel Hilbert space H(K) from random data, measuring the error in L_2(D, ϱ_D). We prove worst-case recovery guarantees that hold with high probability and improve on the recent upper bounds of Krieg and M. Ullrich for general sampling numbers by explicitly controlling all involved constants with respect to the underlying spatial dimension d. This leads to new preasymptotic recovery bounds, valid with high probability, for the error of Hyperbolic Fourier Regression for multivariate functions. In addition, we analyze Hyperbolic Wavelet Regression, an algorithm also based on least squares, for the recovery of non-periodic functions from random samples. As a further application, we reconsider the analysis of a cubature method based on plain random points with optimal weights, introduced by Oettershagen in 2017. We confirm a conjecture that was previously supported only by numerical experiments and give improved near-optimal worst-case error bounds with high probability. It turns out that this simple method can compete with the quasi-Monte Carlo methods in the literature that are based on lattices and digital nets. Last but not least, we contribute new preasymptotic bounds for the problem of recovering individual functions from n samples, which has already been considered by Bohn and Griebel; Cohen, Davenport, and Leviatan; Chkifa, Migliorati, Nobile, and Tempone; Cohen and Migliorati; Krieg; and several others.
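The core least squares procedure can be illustrated by a minimal sketch. The following Python snippet recovers a univariate periodic function from i.i.d. uniform random samples via an unweighted least squares fit in a small trigonometric basis; the basis size, oversampling rate, and test function are illustrative assumptions, not choices taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def target(x):
    # smooth 1-periodic test function (illustrative assumption)
    return np.sin(2 * np.pi * x) + 0.5 * np.cos(6 * np.pi * x)

m = 17                                   # number of Fourier modes k = -8, ..., 8
freqs = np.arange(-(m // 2), m // 2 + 1)
n = 4 * m * int(np.log(m) + 1)           # oversampling n ~ m log m, typical for such analyses

x = rng.uniform(0.0, 1.0, size=n)            # i.i.d. uniform sample nodes
A = np.exp(2j * np.pi * np.outer(x, freqs))  # sampling matrix A[j, k] = exp(2 pi i k x_j)
y = target(x)

# solve the least squares problem min_c || A c - y ||_2
c, *_ = np.linalg.lstsq(A, y, rcond=None)

# evaluate the recovered trigonometric polynomial on a fine grid and check the error
t = np.linspace(0.0, 1.0, 1000, endpoint=False)
approx = np.exp(2j * np.pi * np.outer(t, freqs)) @ c
err = np.max(np.abs(approx.real - target(t)))
print(f"n = {n} samples, max pointwise error = {err:.2e}")
```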
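The cubature method with optimal weights admits a similarly small sketch. Given plain random points, the weights minimizing the worst-case integration error over the unit ball of H(K) solve a linear system with the kernel Gram matrix. The kernel below is the univariate periodic Sobolev (Korobov) kernel of smoothness one, an illustrative assumption rather than the paper's exact setting.

```python
import numpy as np

rng = np.random.default_rng(1)

def kernel(x, y):
    # Korobov / periodic Sobolev kernel of smoothness 1 on [0, 1]
    # (illustrative assumption), in closed form via the Bernoulli
    # polynomial B_2(t) = t^2 - t + 1/6:
    #   K(x, y) = 1 + 2 pi^2 B_2({x - y})
    t = np.mod(x - y, 1.0)
    return 1.0 + 2.0 * np.pi**2 * (t**2 - t + 1.0 / 6.0)

n = 64
nodes = rng.uniform(0.0, 1.0, size=n)        # plain i.i.d. uniform points
G = kernel(nodes[:, None], nodes[None, :])   # Gram matrix K(x_i, x_j)
b = np.ones(n)   # b_j = integral of K(x, x_j) dx over [0, 1], which equals 1 here

w = np.linalg.solve(G, b)                    # optimal cubature weights

# squared worst-case error over the unit ball of H(K):
#   wce^2 = double integral of K - 2 b.w + w^T G w   (the double integral is 1 here)
wce2 = 1.0 - 2.0 * b @ w + w @ G @ w
print(f"n = {n} points, worst-case error = {max(wce2, 0.0) ** 0.5:.3e}")
```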