
Optimal Recovery from Inaccurate Data in Hilbert Spaces: Regularize, but what of the Parameter?

11/04/2021
by Simon Foucart, et al.

In Optimal Recovery, the task of learning a function from observational data is tackled deterministically by adopting a worst-case perspective tied to an explicit model assumption made on the functions to be learned. Working in the framework of Hilbert spaces, this article considers a model assumption based on approximability. It also incorporates observational inaccuracies modeled via additive errors bounded in ℓ_2. Earlier works have demonstrated that regularization provides algorithms that are optimal in this situation, but they did not fully identify the desired hyperparameter. This article fills the gap in both a local scenario and a global scenario. In the local scenario, which amounts to the determination of Chebyshev centers, the semidefinite recipe of Beck and Eldar (legitimately valid in the complex setting only) is complemented by a more direct approach, with the proviso that the observational functionals have orthonormal representers. In the said approach, the desired parameter is the solution to an equation that can be resolved via standard methods. In the global scenario, where linear algorithms rule, the parameter elusive in the works of Micchelli et al. is found as the byproduct of a semidefinite program. Additionally, and quite surprisingly, in the case of observational functionals with orthonormal representers, it is established that any regularization parameter is optimal.
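To make the setting concrete, here is a minimal sketch (not the paper's construction) of Tikhonov-style regularization in a finite-dimensional stand-in for the Hilbert-space problem: the regularized solution depends on a parameter τ, and one common way to pin τ down is to solve the scalar equation matching the data misfit to the assumed ℓ_2 error level ε, which can be done by bisection since the misfit is monotone in τ. The matrix A, data y, and bound ε below are all invented for illustration.

```python
import numpy as np

# Synthetic observation model: y = A x_true + noise, noise bounded in l2.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 8))
x_true = rng.standard_normal(8)
y = A @ x_true + 0.05 * rng.standard_normal(20)
eps = 0.3  # assumed l2 bound on the observational error

def x_tau(tau):
    # Tikhonov-regularized solution: argmin ||A x - y||^2 + tau ||x||^2
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + tau * np.eye(n), A.T @ y)

def misfit(tau):
    # Data misfit ||A x(tau) - y||, nondecreasing in tau
    return np.linalg.norm(A @ x_tau(tau) - y)

# Bisection on tau so that misfit(tau) = eps (discrepancy-style matching)
lo, hi = 1e-12, 1e6
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if misfit(mid) < eps:
        lo = mid
    else:
        hi = mid
tau_star = 0.5 * (lo + hi)
```

This is only the generic mechanism the abstract alludes to when it says the desired parameter "is the solution to an equation that can be resolved via standard methods"; the article's actual parameter selection, in both the local and global scenarios, is characterized precisely there.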

