Optimal post-selection inference for sparse signals: a nonparametric empirical-Bayes approach

10/25/2018
by Spencer Woody et al.

A large body of recent Bayesian work has focused on the question of how to find sparse signals. Much less work, however, has been done on the natural follow-up question: how to make valid inferences for the magnitude of those signals once they've been found. Ordinary Bayesian credible intervals are not necessarily appropriate for this task: in many circumstances, they suffer from selection bias, owing to the fact that the target of inference is chosen adaptively. There are many purely frequentist proposals for addressing this problem. But these typically require sacrificing the benefits of shrinkage or "borrowing strength" inherent to Bayesian modeling, resulting in confidence intervals that are needlessly wide. On the flip side, there are also Bayesian proposals for addressing this problem, most notably that of Yekutieli (2012), who constructs selection-adjusted posterior distributions. The resulting credible intervals, however, have poor frequentist performance: for nearly all values of the underlying parameter, they fail to exhibit the correct nominal coverage. Thus there is an unmet need for approaches to inference that correctly adjust for selection, and incorporate the benefits of shrinkage while maintaining exact frequentist coverage. We address this gap by proposing a nonparametric empirical-Bayes approach for constructing optimal selection-adjusted confidence sets. The method produces confidence sets that are as short as possible, while both adjusting for selection and maintaining exact frequentist coverage uniformly across the whole parameter space. Across a series of examples, the method outperforms existing frequentist techniques for post-selection inference, producing confidence sets that are notably shorter but with the same coverage guarantee.
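To make the selection-bias problem concrete, here is a minimal simulation sketch, not the authors' method: in a sparse normal-means model, coordinates are selected when |y_i| exceeds a threshold, and naive 95% intervals y_i ± 1.96 then under-cover the true parameters among the selected coordinates. The signal proportion, signal scale, and threshold below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(1)

n, n_rep = 1000, 200   # coordinates per replicate, number of replicates
sparsity = 0.95        # proportion of exact zeros (illustrative choice)
threshold = 2.0        # select coordinate i when |y_i| > threshold
z = 1.96               # half-width of a nominal 95% interval

covered, selected = 0, 0
for _ in range(n_rep):
    # Sparse truth: mostly zeros, occasional N(0, 2^2) signals
    theta = np.where(rng.random(n) < sparsity, 0.0, rng.normal(0.0, 2.0, n))
    y = theta + rng.normal(0.0, 1.0, n)   # observations y_i ~ N(theta_i, 1)

    sel = np.abs(y) > threshold           # adaptive selection step
    lo, hi = y[sel] - z, y[sel] + z       # naive, unadjusted intervals
    covered += np.sum((lo <= theta[sel]) & (theta[sel] <= hi))
    selected += np.sum(sel)

print(f"naive 95% CI coverage among selected: {covered / selected:.3f}")
```

The coverage falls well below 95% because every selected null has |y_i| > 1.96, so its naive interval necessarily excludes the true value 0; this is exactly the adaptively-chosen-target bias that selection-adjusted confidence sets are designed to correct.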
