
Distributed Nonparametric Function Estimation: Optimal Rate of Convergence and Cost of Adaptation

07/01/2021
by T. Tony Cai, et al.

Distributed minimax estimation and distributed adaptive estimation under communication constraints are studied for the Gaussian sequence model and the white noise model. The minimax rate of convergence for distributed estimation over a given Besov class, which serves as a benchmark for the cost of adaptation, is established. We then quantify the exact communication cost of adaptation and construct an optimally adaptive procedure for distributed estimation over a range of Besov classes. The results demonstrate significant differences between nonparametric function estimation in the distributed setting and in the conventional centralized setting: for global estimation, adaptation in general cannot be achieved for free in the distributed setting. The new technical tools developed to obtain the exact characterization of the cost of adaptation may be of independent interest.
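In the setting the abstract describes, m machines each observe the same unknown coefficient sequence theta corrupted by independent Gaussian noise, and each machine may transmit only a limited number of bits to a fusion center that forms the final estimate. The following Python/NumPy sketch illustrates that setting with a deliberately naive protocol: each machine uniformly quantizes its first k empirical coefficients to B bits each, and the center averages the messages. The quantizer, the truncation level k, and all numerical parameters are illustrative assumptions for exposition, not the paper's optimal procedure.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem setup: a smooth signal in sequence space,
# observed independently by each of m machines with Gaussian noise.
n_coef, m, sigma = 512, 20, 1.0
theta = np.array([i ** -1.5 for i in range(1, n_coef + 1)])
local_obs = theta + sigma * rng.standard_normal((m, n_coef))

# Communication constraint (assumed for this sketch): each machine
# describes only its first k coefficients, each rounded by a B-bit
# uniform quantizer on [-L, L].
k, B, L = 64, 4, 4.0
levels = 2 ** B

def quantize(x, L=L, levels=levels):
    # Uniform B-bit quantizer on [-L, L]; a crude stand-in for the
    # careful coding schemes analyzed in the paper.
    clipped = np.clip(x, -L, L)
    step = 2 * L / (levels - 1)
    return np.round((clipped + L) / step) * step - L

# Fusion center: average the quantized messages; coefficients that
# were never transmitted are estimated as 0 (truncation).
messages = quantize(local_obs[:, :k])
theta_hat = np.zeros(n_coef)
theta_hat[:k] = messages.mean(axis=0)

# Centralized benchmark: average the raw, unquantized observations.
theta_central = np.zeros(n_coef)
theta_central[:k] = local_obs[:, :k].mean(axis=0)

print("distributed risk :", np.sum((theta_hat - theta) ** 2))
print("centralized risk :", np.sum((theta_central - theta) ** 2))

Under such a crude scheme the quantization and truncation losses are visible against the centralized benchmark; the paper's results characterize how small these losses can be made under a given bit budget, and how much additional communication is needed to adapt to unknown smoothness.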

Related research

04/03/2018
Adaptive distributed methods under communication constraints
We study distributed estimation methods under communication constraints ...

03/28/2020
Distributed function estimation: adaptation using minimal communication
We investigate whether in a distributed setting, adaptive estimation of ...

11/08/2017
An asymptotic analysis of distributed nonparametric methods
We investigate and compare the fundamental performance of several distri...

04/21/2022
Distributed Nonparametric Estimation under Communication Constraints
In the era of big data, it is necessary to split extremely large data se...

04/16/2021
Estimation of the Global Mode of a Density: Minimaxity, Adaptation, and Computational Complexity
We consider the estimation of the global mode of a density under some de...

12/21/2017
Density Estimation with Contaminated Data: Minimax Rates and Theory of Adaptation
This paper studies density estimation under pointwise loss in the settin...