Adaptive approximation of monotone functions

09/14/2023
by Pierre Gaillard, et al.

We study the classical problem of approximating a non-decreasing function f: 𝒳→𝒴 in L^p(μ) norm by sequentially querying its values, for known compact real intervals 𝒳, 𝒴 and a known probability measure μ on 𝒳. For any function f we characterize the minimum number of evaluations of f that algorithms need to guarantee an approximation f̂ with an L^p(μ) error below ϵ after stopping. Unlike worst-case results that hold uniformly over all f, our complexity measure depends on each specific function f. To address this problem, we introduce GreedyBox, a generalization of an algorithm originally proposed by Novak (1992) for numerical integration. We prove that GreedyBox achieves an optimal sample complexity for any function f, up to logarithmic factors. Additionally, we uncover results regarding piecewise-smooth functions. Perhaps as expected, the L^p(μ) error of GreedyBox decreases much faster for piecewise-C^2 functions than predicted by the algorithm (without any knowledge of the smoothness of f). A simple modification even achieves optimal minimax approximation rates for such functions, which we compute explicitly. In particular, our findings highlight multiple performance gaps between adaptive and non-adaptive algorithms, smooth and piecewise-smooth functions, as well as monotone and non-monotone functions. Finally, we provide numerical experiments to support our theoretical results.
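The abstract does not spell out the algorithm, but the key idea behind a GreedyBox-style strategy can be sketched: monotonicity means that between two queried points x_l < x_r, the graph of f is trapped in the "box" [x_l, x_r] × [f(x_l), f(x_r)], which yields a computable worst-case error bound per interval; the algorithm greedily bisects the interval with the largest bound. The sketch below is a simplified illustration of this idea (uniform μ on [0, 1], L^1 norm), not the exact algorithm from the paper; the function name and error bound are illustrative assumptions.

```python
import heapq

def greedy_box(f, a, b, budget):
    """Illustrative sketch of a greedy box-bisection strategy.

    Assumes f is non-decreasing on [a, b], mu is uniform, and the error
    is measured in L^1. For a non-decreasing f, the best constant
    approximation on [x_l, x_r] given the endpoint values has worst-case
    L^1 error at most (x_r - x_l) * (f(x_r) - f(x_l)) / 2; we repeatedly
    bisect the interval with the largest such bound until the query
    budget is exhausted, then return the total remaining error bound.
    """
    fa, fb = f(a), f(b)
    # Max-heap via negated priorities: (-bound, x_l, f(x_l), x_r, f(x_r)).
    heap = [(-(b - a) * (fb - fa) / 2, a, fa, b, fb)]
    queries = 2  # the two endpoint evaluations
    while queries < budget:
        _, xl, fl, xr, fr = heapq.heappop(heap)
        xm = (xl + xr) / 2
        fm = f(xm)  # one new evaluation of f
        queries += 1
        heapq.heappush(heap, (-(xm - xl) * (fm - fl) / 2, xl, fl, xm, fm))
        heapq.heappush(heap, (-(xr - xm) * (fr - fm) / 2, xm, fm, xr, fr))
    # Sum of per-interval bounds: a certified L^1 error guarantee.
    return sum(-w for w, *_ in heap)

# Example: a smooth monotone function with 64 queries.
err = greedy_box(lambda x: x ** 2, 0.0, 1.0, budget=64)
```

Note how the guarantee is function-dependent: flat regions of f contribute small boxes and are refined less, which is exactly the adaptivity that a non-adaptive (uniform-grid) scheme cannot exploit.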
