Kernel Debiased Plug-in Estimation

06/14/2023
by Brian Cho, et al.

We consider the problem of estimating a scalar target parameter in the presence of nuisance parameters. Replacing the unknown nuisance parameter with a nonparametric estimator, e.g., a machine learning (ML) model, is convenient but has been shown to be inefficient due to large biases. Modern methods, such as targeted minimum loss-based estimation (TMLE) and double machine learning (DML), achieve optimal performance under flexible assumptions by harnessing ML estimates while mitigating the plug-in bias. To avoid a sub-optimal bias-variance trade-off, these methods perform a debiasing step on the plug-in pre-estimate. Existing debiasing methods require the influence function (IF) of the target parameter as input. However, deriving the IF requires specialized expertise and thus hinders the adoption of these methods by practitioners. We propose a novel way to debias plug-in estimators that (i) is efficient, (ii) does not require the IF to be implemented, and (iii) is computationally tractable, and can therefore be readily adapted to new estimation problems and automated without analytic derivations by the user. We build on the TMLE framework and update a plug-in estimate with a regularized likelihood maximization step over a nonparametric model constructed with a reproducing kernel Hilbert space (RKHS), producing an efficient plug-in estimate for any regular target parameter. Our method thus offers the efficiency of competing debiasing techniques without sacrificing the utility of the plug-in approach.
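To make the plug-in bias concrete, here is a minimal numerical sketch of the phenomenon the abstract describes, using the classic IF-based one-step correction that the paper's IF-free approach is positioned against (this is *not* the paper's RKHS method). The target is the counterfactual mean E[Y(1)]; the data-generating process, the deliberately over-regularized ridge outcome model, and the logistic propensity fit are all illustrative assumptions introduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
X = rng.normal(size=n)
p = 1.0 / (1.0 + np.exp(-X))                 # true propensity P(A=1 | X)
A = rng.binomial(1, p)
Y = 1.0 + 2.0 * X + A + rng.normal(size=n)   # true target E[Y(1)] = 2.0

def ridge_fit(Z, y, lam):
    """Closed-form ridge regression; heavy lam shrinks coefficients."""
    return np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ y)

# Outcome model mu_hat(a, x): over-regularized on purpose, so the plug-in
# estimate inherits the shrinkage bias of the nuisance fit.
Z = np.column_stack([np.ones(n), X, A])
beta = ridge_fit(Z, Y, lam=2000.0)
mu1 = np.column_stack([np.ones(n), X, np.ones(n)]) @ beta  # mu_hat(1, X)

def logistic_fit(Z, a, iters=25):
    """Well-specified logistic regression fit by Newton-Raphson."""
    w = np.zeros(Z.shape[1])
    for _ in range(iters):
        q = 1.0 / (1.0 + np.exp(-Z @ w))
        grad = Z.T @ (a - q)
        H = Z.T @ (Z * (q * (1.0 - q))[:, None])
        w += np.linalg.solve(H, grad)
    return w

Zp = np.column_stack([np.ones(n), X])
pi = 1.0 / (1.0 + np.exp(-Zp @ logistic_fit(Zp, A)))  # pi_hat(X)

# Naive plug-in vs. one-step debiased estimate. The correction term is the
# empirical mean of the (efficient) IF component A/pi * (Y - mu1).
plug_in = mu1.mean()
one_step = plug_in + np.mean(A / pi * (Y - mu1))
print(f"plug-in: {plug_in:.3f}, one-step: {one_step:.3f}, truth: 2.0")
```

The one-step estimate recovers the truth even though the outcome model is badly shrunk, but it requires knowing the IF of E[Y(1)] in closed form; the paper's contribution is to achieve the same debiasing without the user deriving that IF.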

07/30/2016

Double/Debiased Machine Learning for Treatment and Causal Parameters

Most modern supervised statistical/machine learning (ML) methods are exp...
03/29/2023

One-Step Estimation of Differentiable Hilbert-Valued Parameters

We present estimators for smooth Hilbert-valued parameters, where smooth...
03/12/2022

Semiparametric doubly robust targeted double machine learning: a review

In this review we cover the basics of efficient nonparametric parameter ...
03/04/2020

Universal sieve-based strategies for efficient estimation using machine learning tools

Suppose that we wish to estimate a finite-dimensional summary of one or ...
11/03/2018

Canonical Least Favorable Submodels: A New TMLE Procedure for Multidimensional Parameters

This paper is a fundamental addition to the world of targeted maximum li...
06/13/2023

Nonparametric inference on non-negative dissimilarity measures at the boundary of the parameter space

It is often of interest to assess whether a function-valued statistical ...
06/07/2023

Inferring unknown unknowns: Regularized bias-aware ensemble Kalman filter

Because of physical assumptions and numerical approximations, reduced-or...
