
Minimax Linear Estimation of the Retargeted Mean

by David A. Hirshberg, et al.

Weighting methods that adjust for observed covariates, such as inverse probability weighting, are widely used for causal inference and estimation with incomplete outcome data. These methods are appealingly interpretable: we observe an outcome of interest on one 'observation' sample and wish to know its average over another 'target' sample, so we weight the observation sample so it looks like the target and calculate the weighted average outcome. In this paper, we discuss a minimax linear estimator for this retargeted mean estimation problem. It is a weighting estimator that optimizes for similarity, as stochastic processes acting on a class of smooth functions, between the empirical measure of the covariates on the weighted observation sample and that on the target sample. This approach generalizes methods that optimize for similarity of the covariate means or marginal distributions, which correspond to taking this class of functions to be linear or additive respectively. Focusing on the case that this class is the unit ball of a reproducing kernel Hilbert space, we show that the minimax linear estimator is semiparametrically efficient under weak conditions; establish bounds attesting to the estimator's good finite sample properties; and observe promising performance on simulated data throughout a wide range of sample sizes, noise levels, and levels of overlap between the covariate distributions for the observation and target samples.
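To make the idea concrete, here is a minimal NumPy sketch of this style of weighting in the RKHS case: the weights minimize the squared RKHS (maximum mean discrepancy) distance between the weighted observation sample's empirical kernel mean and the target sample's, plus a ridge penalty on the weights, which yields a closed-form linear solve. This is an illustrative simplification, not the paper's exact estimator; the kernel, its bandwidth `gamma`, and the penalty `lam` are all assumed tuning choices.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between the rows of A and the rows of B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def balancing_weights(X_obs, X_tgt, lam=1e-2, gamma=0.5):
    """Weights w minimizing the squared RKHS distance between
    sum_i w_i k(x_i, .) and the target kernel mean (1/m) sum_j k(z_j, .),
    plus a ridge penalty lam * ||w||^2 to keep the weights stable.
    Closed form: w = (K_xx + lam I)^{-1} K_xt 1/m.  (Illustrative sketch.)"""
    K_xx = rbf_kernel(X_obs, X_obs, gamma)
    K_xt = rbf_kernel(X_obs, X_tgt, gamma)
    n = K_xx.shape[0]
    return np.linalg.solve(K_xx + lam * np.eye(n), K_xt.mean(axis=1))

# Usage: estimate the target-sample mean outcome from the observation sample.
rng = np.random.default_rng(0)
X_obs = rng.normal(0.0, 1.0, size=(200, 2))   # observation covariates
X_tgt = rng.normal(0.5, 1.0, size=(200, 2))   # target covariates (shifted)
y_obs = X_obs.sum(axis=1) + rng.normal(0.0, 0.1, size=200)  # outcomes seen only on obs
w = balancing_weights(X_obs, X_tgt)
estimate = w @ y_obs  # weighted average: estimate of the retargeted mean
```

Because the outcome here is a smooth function of the covariates, the weighted average should land much closer to the target-sample mean outcome than the unweighted observation-sample average does.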
