The Fundamental Limits of Structure-Agnostic Functional Estimation

05/06/2023
by Sivaraman Balakrishnan, et al.

Many recent developments in causal inference, and functional estimation problems more generally, have been motivated by the fact that classical one-step (first-order) debiasing methods, or their more recent sample-split double machine learning avatars, can outperform plugin estimators under surprisingly weak conditions. These first-order corrections improve on plugin estimators in a black-box fashion and are consequently often used in conjunction with powerful off-the-shelf estimation methods. However, these first-order methods are provably suboptimal in a minimax sense for functional estimation when the nuisance functions live in Hölder-type function spaces. This suboptimality of first-order debiasing has motivated the development of "higher-order" debiasing methods. The resulting estimators are, in some cases, provably optimal over Hölder-type spaces, but both the minimax-optimal estimators and their analyses are crucially tied to properties of the underlying function space. In this paper, we investigate the fundamental limits of structure-agnostic functional estimation, where only relatively weak conditions are placed on the underlying nuisance functions. We show that there is a strong sense in which existing first-order methods are optimal. We achieve this by formalizing the problem of functional estimation with black-box nuisance function estimates and deriving minimax lower bounds for this problem. Our results highlight a clear tradeoff in functional estimation: if we wish to remain agnostic to the underlying nuisance function spaces, impose only high-level rate conditions, and maintain compatibility with black-box nuisance estimators, then first-order methods are optimal. When we do have an understanding of the structure of the underlying nuisance functions, carefully constructed higher-order estimators can outperform first-order ones.
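To make the abstract's objects concrete: the canonical first-order method it refers to is the one-step (AIPW) debiased estimator of the average treatment effect, combined with sample splitting as in double machine learning. The sketch below is illustrative, not the paper's construction; the nuisance fitters `fit_mu` and `fit_pi` are hypothetical placeholders standing in for arbitrary black-box learners, which is precisely the structure-agnostic setting.

```python
# Minimal sketch of a first-order (one-step / AIPW) debiased estimator of
# E[Y(1) - Y(0)] with cross-fitting, in the spirit of double machine learning.
# fit_mu and fit_pi are black-box nuisance estimators supplied by the user.
import numpy as np

def aipw_cross_fit(X, A, Y, fit_mu, fit_pi, n_folds=2, rng=None):
    """One-step debiased (AIPW) ATE estimate with sample splitting.

    fit_mu(X, A, Y) -> mu, where mu(x, a) predicts E[Y | X=x, A=a]
    fit_pi(X, A)    -> pi, where pi(x) predicts P(A=1 | X=x)
    Both nuisance estimators are treated entirely as black boxes.
    """
    rng = np.random.default_rng(rng)
    n = len(Y)
    folds = rng.permutation(n) % n_folds        # random balanced fold labels
    psi = np.empty(n)
    for k in range(n_folds):
        train, test = folds != k, folds == k
        mu = fit_mu(X[train], A[train], Y[train])   # outcome regression
        pi = fit_pi(X[train], A[train])             # propensity score
        m1, m0 = mu(X[test], 1), mu(X[test], 0)
        e = np.clip(pi(X[test]), 1e-3, 1 - 1e-3)    # trim for overlap
        # Plugin term plus the first-order (influence-function) correction:
        psi[test] = (m1 - m0
                     + A[test] * (Y[test] - m1) / e
                     - (1 - A[test]) * (Y[test] - m0) / (1 - e))
    return psi.mean()
```

The correction term is what buys the "surprisingly weak conditions": the estimator's bias is second-order, a product of the errors of the two nuisance estimates, so it remains root-n consistent whenever that product vanishes fast enough, with no assumption about *how* the black-box learners achieve their rates. The paper's lower bounds show that, under such high-level rate conditions alone, this first-order structure cannot be improved upon.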
