Convergence bounds for empirical nonlinear least-squares

01/02/2020
by Martin Eigel, et al.

We consider best approximation problems in a (nonlinear) subspace M of a Banach space (V, ‖·‖) where only an empirical estimate ‖·‖_n of the norm can be computed. The norm is assumed to be of the form ‖v‖ := E_Y[|v|_Y^2]^(1/2) for some (parametric) seminorm |·|_Y depending on a random variable Y. The objective is to approximate an unknown function u ∈ V by some v ∈ M by minimizing the empirical norm ‖u − v‖_n^2 := (1/n) ∑_{i=1}^n |u − v|_{y_i}^2 with respect to n random samples {y_i}_{i=1,...,n}. It is well known that such least-squares approximations can become inaccurate and unstable when the number of samples n is too close to the number of parameters m ∝ dim(M). We review this statement in the light of adapted distributions for the samples y_i and establish error bounds for the empirical best approximation error based on a restricted isometry property (RIP), namely (1 − δ)‖v‖^2 ≤ ‖v‖_n^2 ≤ (1 + δ)‖v‖^2 for all v ∈ M, which holds in probability. These results are closely related to those in "Optimal weighted least-squares methods" (A. Cohen and G. Migliorati, 2016) and show that n > sm samples are sufficient for the RIP to be satisfied with high probability. The factor s represents the variation of the empirical norm ‖·‖_n on M and depends on the choice of the distribution of the samples. Several model classes are examined, and numerical experiments illustrate some of the obtained stability bounds.
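For intuition, here is a minimal numerical sketch (our own illustration under simplifying assumptions, not code from the paper): for a linear model class M spanned by m orthonormal basis functions, the RIP constant reduces to δ = ‖G_n − I‖_2, where G_n is the empirical Gram matrix built from the samples. The sketch compares plain uniform sampling with a Chebyshev (arcsine) sampling density and importance weights w = dμ/dν, a standard adapted distribution for polynomial spaces; all function and variable names below are ours.

```python
# Illustrative sketch (hypothetical, not the paper's code): estimate the RIP
# constant delta = ||G_n - I||_2 for a linear model class M spanned by
# Legendre polynomials, under two sampling distributions.
import numpy as np
from numpy.polynomial import legendre

def orthonormal_legendre(y, m):
    """Evaluate the first m Legendre polynomials at points y, normalized to be
    orthonormal w.r.t. the uniform probability measure on [-1, 1]."""
    V = legendre.legvander(y, m - 1)          # columns P_0, ..., P_{m-1}
    return V * np.sqrt(2 * np.arange(m) + 1)  # E[P_k^2] = 1/(2k+1) under uniform

def rip_constant(y, w, m):
    """delta = ||G_n - I||_2 for weighted samples y with importance weights w."""
    B = orthonormal_legendre(y, m)
    G = (B * w[:, None]).T @ B / len(y)       # empirical Gram matrix G_n
    return np.linalg.norm(G - np.eye(m), 2)   # spectral-norm deviation from I

rng = np.random.default_rng(0)
m, n = 10, 50

# Plain uniform sampling: weights are identically 1.
y_unif = rng.uniform(-1, 1, n)
print("uniform   delta:", rip_constant(y_unif, np.ones(n), m))

# Chebyshev (arcsine) sampling with weights w = dmu/dnu, a common
# adapted distribution for polynomial model classes.
y_cheb = np.cos(np.pi * rng.uniform(0, 1, n))
w_cheb = (np.pi / 2) * np.sqrt(1 - y_cheb**2)
print("chebyshev delta:", rip_constant(y_cheb, w_cheb, m))
```

With n only a few times m, the adapted weighted sampling typically yields a noticeably smaller δ than plain uniform sampling, reflecting how the sampling distribution enters the stability factor s in the bound n > sm.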
