Optimal Recovery from Inaccurate Data in Hilbert Spaces: Regularize, but what of the Parameter?

11/04/2021
by Simon Foucart, et al.

In Optimal Recovery, the task of learning a function from observational data is tackled deterministically by adopting a worst-case perspective tied to an explicit model assumption made on the functions to be learned. Working in the framework of Hilbert spaces, this article considers a model assumption based on approximability. It also incorporates observational inaccuracies modeled via additive errors bounded in ℓ_2. Earlier works have demonstrated that regularization provides algorithms that are optimal in this situation, but did not fully identify the desired hyperparameter. This article fills the gap in both a local scenario and a global scenario. In the local scenario, which amounts to the determination of Chebyshev centers, the semidefinite recipe of Beck and Eldar (legitimately valid in the complex setting only) is complemented by a more direct approach, with the proviso that the observational functionals have orthonormal representers. In the said approach, the desired parameter is the solution to an equation that can be resolved via standard methods. In the global scenario, where linear algorithms rule, the parameter elusive in the works of Micchelli et al. is found as the byproduct of a semidefinite program. Additionally and quite surprisingly, in the case of observational functionals with orthonormal representers, it is established that any regularization parameter is optimal.
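To illustrate the kind of regularization map discussed above, here is a minimal finite-dimensional sketch in Python/NumPy. It assumes a weighted Tikhonov-type objective (1-τ)·||Λf - y||_2^2 + τ·||(Id - P_V)f||_2^2 with τ strictly between 0 and 1, where ||(Id - P_V)f|| is the distance of f to the approximation space V; the names Lam, PV, regularized_recovery and the toy dimensions are illustrative assumptions, not the article's exact formulation, and the optimal choice of τ derived in the article is not reproduced here.

import numpy as np

def regularized_recovery(Lam, PV, y, tau):
    """Hypothetical sketch: solve min_f (1 - tau)*||Lam @ f - y||^2 + tau*||(I - PV) @ f||^2.

    Lam : (m, n) array whose rows represent the m observation functionals
    PV  : (n, n) orthogonal projector onto the approximation space V
    y   : (m,) vector of inaccurate observations
    tau : regularization parameter, assumed to lie strictly between 0 and 1
    """
    n = Lam.shape[1]
    # Normal equations of the quadratic objective above.
    A = (1 - tau) * (Lam.T @ Lam) + tau * (np.eye(n) - PV)
    b = (1 - tau) * (Lam.T @ y)
    return np.linalg.solve(A, b)

# Toy usage with orthonormal representers (Lam has orthonormal rows).
rng = np.random.default_rng(0)
n, m, k = 20, 8, 3
Q, _ = np.linalg.qr(rng.standard_normal((n, m)))   # (n, m), orthonormal columns
Lam = Q.T                                          # (m, n), orthonormal rows
V, _ = np.linalg.qr(rng.standard_normal((n, k)))   # basis of the approximation space V
PV = V @ V.T                                       # orthogonal projector onto V
f_true = V @ rng.standard_normal(k) + 0.05 * rng.standard_normal(n)   # nearly in V
y = Lam @ f_true + 0.01 * rng.standard_normal(m)                      # noisy observations
f_hat = regularized_recovery(Lam, PV, y, tau=0.5)
print(np.linalg.norm(f_hat - f_true))

In this toy orthonormal-representers setting, varying tau simply trades off data fidelity against proximity to V; the surprising result reported in the abstract is that, in the global scenario with orthonormal representers, any such choice of the regularization parameter is optimal.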


Related research:

06/05/2020 · Learning from Non-IID Data in Hilbert Spaces: An Optimal Recovery Perspective
The notion of generalization in classical Statistical Learning is often ...

04/02/2023 · On the Optimal Recovery of Graph Signals
Learning a smooth graph signal from partially observed data is a well-st...

11/03/2020 · Function values are enough for L_2-approximation: Part II
In the first part we have shown that, for L_2-approximation of functions...

04/27/2020 · Integration in reproducing kernel Hilbert spaces of Gaussian kernels
The Gaussian kernel plays a central role in machine learning, uncertaint...

04/26/2023 · Kernel Methods are Competitive for Operator Learning
We present a general kernel-based framework for learning operators betwe...

01/29/2019 · Kernel embedded nonlinear observational mappings in the variational mapping particle filter
Recently, some works have suggested methods to combine variational proba...

04/01/2020 · Instances of Computational Optimal Recovery: Refined Approximability Models
Models based on approximation capabilities have recently been studied in...
