Sampling numbers of smoothness classes via ℓ^1-minimization

12/01/2022
by Thomas Jahn, et al.

Using techniques recently developed in the field of compressed sensing, we prove new upper bounds for general (non-linear) sampling numbers of (quasi-)Banach smoothness spaces in L^2. In relevant cases, such as mixed and isotropic weighted Wiener classes or Sobolev spaces with mixed smoothness, sampling numbers in L^2 can be upper bounded by best n-term trigonometric widths in L^∞. We describe a recovery procedure based on ℓ^1-minimization (basis pursuit denoising) that uses only m function values, with m close to n. This method yields a significant gain in the rate of convergence compared to recently developed linear recovery methods. In this deterministic worst-case setting, we observe an additional speed-up of n^{-1/2} over linear methods in the case of weighted Wiener spaces; for their quasi-Banach counterparts, even an arbitrary polynomial speed-up is possible. Surprisingly, our approach allows us to recover mixed-smoothness Sobolev functions belonging to S^r_p W(𝕋^d) on the d-torus with a logarithmically better error decay than any linear method can achieve when 1 < p < 2 and d is large. This effect is not present for isotropic Sobolev spaces.
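To make the recovery procedure more concrete, the following is a minimal sketch of basis pursuit denoising applied to function samples: a sparse expansion in a trigonometric dictionary is recovered from m noisy point values by minimizing the ℓ^1-norm of the coefficients. This is an illustration of the general technique, not the paper's exact construction; the dictionary size N, sparsity s, sampling distribution, and tolerance eps are assumed toy parameters, and cvxpy serves as a generic convex solver.

```python
# Minimal BPDN sketch (assumed toy setup, not the paper's construction):
# recover a sparse coefficient vector of a function on the torus from
# m sampled function values via ell^1-minimization.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)

N = 128   # size of the truncated trigonometric dictionary (assumption)
s = 5     # sparsity of the synthetic target (assumption)
m = 40    # number of function samples, "m close to n" in the abstract

# Synthetic s-sparse coefficient vector; a real cosine dictionary is used
# here instead of the full complex exponential system, for simplicity.
coef = np.zeros(N)
support = rng.choice(N, size=s, replace=False)
coef[support] = rng.standard_normal(s)

# Random sampling points on [0, 2*pi) and the sampling matrix
# A[j, k] = cos(k * t_j), so that f(t_j) = (A @ coef)[j].
t = rng.uniform(0.0, 2 * np.pi, size=m)
A = np.cos(np.outer(t, np.arange(N)))
y = A @ coef + 1e-3 * rng.standard_normal(m)  # noisy function values

# Basis pursuit denoising: minimize ||x||_1 s.t. ||A x - y||_2 <= eps.
x = cp.Variable(N)
eps = 1e-2  # noise tolerance (assumption)
problem = cp.Problem(cp.Minimize(cp.norm1(x)), [cp.norm2(A @ x - y) <= eps])
problem.solve()

print("recovery error:", np.linalg.norm(x.value - coef))
```

Note that the recovery step is non-linear in the data y, which is exactly why such a procedure can beat the linear sampling rates discussed in the abstract.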
