A unifying approach for doubly-robust ℓ_1 regularized estimation of causal contrasts

04/07/2019
by   Ezequiel Smucler, et al.

We consider inference about a scalar parameter under a non-parametric model, based on a one-step estimator computed as a plug-in estimator plus the empirical mean of an estimator of the parameter's influence function. We focus on a class of parameters whose influence function depends on two infinite-dimensional nuisance functions, and for which the bias of the one-step estimator is the expectation of the product of the estimation errors of the two nuisance functions. Our class includes many important treatment effect contrasts of interest in causal inference and econometrics, such as the average treatment effect (ATE), the average treatment effect on the treated (ATT), an integrated causal contrast with a continuous treatment, and the mean of an outcome missing not at random. We propose estimators of the target parameter that entertain approximately sparse regression models for the nuisance functions, allowing the number of potential confounders to exceed the sample size. By employing sample splitting, cross-fitting, and ℓ_1-regularized regression estimators of the nuisance functions based on objective functions whose directional derivatives agree with those of the parameter's influence function, we obtain estimators of the target parameter with two desirable robustness properties: (1) they are rate doubly-robust, in that they are root-n consistent and asymptotically normal when both nuisance functions follow approximately sparse models, even if the regression coefficient of one function is very non-sparse, so long as that of the other is sufficiently sparse; and (2) they are model doubly-robust, in that they are root-n consistent and asymptotically normal even if one of the nuisance functions does not follow an approximately sparse model, so long as the other follows an approximately sparse model with a sufficiently sparse regression coefficient.
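For intuition, the sketch below illustrates the general recipe the abstract describes for the ATE case: a cross-fitted one-step (doubly-robust) estimator whose two nuisance functions are fit with ℓ_1-regularized regressions. It is a minimal sketch, not the authors' procedure; in particular, it substitutes off-the-shelf Lasso and ℓ_1-penalized logistic regression for the paper's objective functions whose directional derivatives match the influence function, and the function name and tuning parameters (cross_fitted_dr_ate, lasso_alpha, logit_C) are illustrative choices, not from the paper.

    # Minimal sketch of a cross-fitted, doubly-robust (one-step) ATE estimator
    # with l1-regularized nuisance fits. Illustrative only: the paper uses
    # tailored losses for the nuisances, not plain Lasso / l1-logistic fits.
    import numpy as np
    from sklearn.linear_model import LogisticRegression, Lasso
    from sklearn.model_selection import KFold

    def cross_fitted_dr_ate(X, A, Y, n_splits=2, lasso_alpha=0.1, logit_C=1.0):
        psi = np.zeros(len(Y))  # per-observation influence-function values
        kf = KFold(n_splits=n_splits, shuffle=True, random_state=0)
        for train, test in kf.split(X):
            # Nuisance 1: l1-penalized logistic regression for the propensity score.
            ps = LogisticRegression(penalty="l1", C=logit_C, solver="liblinear")
            ps.fit(X[train], A[train])
            e_hat = np.clip(ps.predict_proba(X[test])[:, 1], 0.01, 0.99)

            # Nuisance 2: Lasso outcome regressions within each treatment arm.
            mu1 = Lasso(alpha=lasso_alpha).fit(X[train][A[train] == 1], Y[train][A[train] == 1])
            mu0 = Lasso(alpha=lasso_alpha).fit(X[train][A[train] == 0], Y[train][A[train] == 0])
            m1, m0 = mu1.predict(X[test]), mu0.predict(X[test])

            # One-step contribution: plug-in term plus influence-function correction.
            psi[test] = (m1 - m0
                         + A[test] * (Y[test] - m1) / e_hat
                         - (1 - A[test]) * (Y[test] - m0) / (1 - e_hat))
        ate = psi.mean()
        se = psi.std(ddof=1) / np.sqrt(len(Y))
        return ate, se  # point estimate and standard error for a Wald interval

Here X, A, and Y are NumPy arrays holding the covariates, a binary treatment indicator, and the outcome; averaging the estimated influence-function values over held-out folds is what gives the estimator its one-step, cross-fitted form.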

