Optimal approximation of infinite-dimensional holomorphic functions

05/29/2023
by Ben Adcock, et al.

Over the last decade, approximating functions in infinite dimensions from samples has gained increasing attention in computational science and engineering, especially in computational uncertainty quantification. This is primarily due to the relevance of functions that arise as solutions to parametric differential equations in fields such as chemistry, economics, engineering, and physics. While acquiring accurate and reliable approximations of such functions is inherently difficult, current benchmark methods exploit the fact that these functions often belong to certain classes of holomorphic functions to achieve algebraic convergence rates in infinite dimensions with respect to the number of (potentially adaptive) samples m. Our work focuses on providing theoretical approximation guarantees for the class of (b,ε)-holomorphic functions, demonstrating that these algebraic rates are the best possible for Banach-valued functions in infinite dimensions. We establish lower bounds using a reduction to a discrete problem combined with the theory of m-widths, Gelfand widths, and Kolmogorov widths. We study two cases, known and unknown anisotropy, in which the relative importance of the variables is known and unknown, respectively. A key conclusion of our paper is that in the latter setting, approximation from finite samples is impossible without some inherent ordering of the variables, even if the samples are chosen adaptively. Finally, in both cases we demonstrate near-optimal, non-adaptive (random) sampling and recovery strategies which achieve rates close to those of the lower bounds.
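For context, the following recalls the two classical widths used in such lower-bound arguments, together with the algebraic rate exponents typically established in this line of work. This is a minimal LaTeX sketch using generic notation: the symbols d_m and d^m, and the assumption that the anisotropy parameter b lies in ℓ^p for some 0 < p < 1, are standard in the literature and not quoted verbatim from the paper.

% Kolmogorov m-width of a set K in a normed space Y: the best
% worst-case error achievable by any m-dimensional linear subspace.
\[
  d_m(K)_Y = \inf_{\substack{Y_m \subseteq Y \\ \dim(Y_m) \le m}}
  \sup_{f \in K} \inf_{g \in Y_m} \| f - g \|_Y .
\]
% Gelfand m-width: the worst-case size of elements of K lying in a
% subspace of codimension at most m; it models recovery from m linear
% measurements (samples).
\[
  d^m(K)_Y = \inf_{\substack{A \subseteq Y \\ \operatorname{codim}(A) \le m}}
  \sup_{f \in K \cap A} \| f \|_Y .
\]
% Typical algebraic rates for (b,eps)-holomorphic classes when
% b belongs to \ell^p, 0 < p < 1 (shown for context only; see the
% paper for the precise statements):
\[
  \text{known anisotropy: } \mathcal{O}\big(m^{1 - 1/p}\big), \qquad
  \text{unknown anisotropy: } \mathcal{O}\big(m^{1/2 - 1/p}\big).
\]

Loosely speaking, since the Gelfand width lower-bounds the error of any sampling-and-recovery scheme based on m linear measurements, even adaptive ones (up to constants), width estimates for suitable discrete sets embedded in the function class translate into lower bounds of the above algebraic form.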

Related research

11/22/2022 · Near-optimal learning of Banach-valued, high-dimensional functions via deep neural networks
The past decade has seen increasing interest in applying Deep Learning (...

03/25/2022 · On efficient algorithms for computing near-best polynomial approximations to high-dimensional, Hilbert-valued functions from limited samples
Sparse polynomial approximation has become indispensable for approximati...

07/09/2019 · Convergence Rates for Gaussian Mixtures of Experts
We provide a theoretical treatment of over-specified Gaussian mixtures o...

08/18/2022 · Is Monte Carlo a bad sampling strategy for learning smooth functions in high dimensions?
This paper concerns the approximation of smooth, high-dimensional functi...

02/04/2022 · Towards optimal sampling for learning sparse approximation in high dimensions
In this chapter, we discuss recent work on learning sparse approximation...

01/31/2022 · An Adaptive sampling and domain learning strategy for multivariate function approximation on unknown domains
Many problems in computational science and engineering can be described ...

07/10/2020 · Adaptive reconstruction of imperfectly-observed monotone functions, with applications to uncertainty quantification
Motivated by the desire to numerically calculate rigorous upper and lowe...
