
Controlling False Discovery Rate Using Gaussian Mirrors

by Xin Xing, et al.

Simultaneously identifying multiple influential variables and controlling the false discovery rate (FDR) in linear regression models is a fundamental problem with a long history. We propose the Gaussian Mirror (GM) method, which creates for each predictor a pair of mirror variables by adding and subtracting a randomly generated Gaussian variable, and then proceeds with a standard regression method, such as ordinary least squares or the Lasso. The mirror variables naturally lead to a test statistic that is highly effective for controlling the FDR. Under a weak dependence assumption, we show that the FDR can be controlled asymptotically at any user-specified level. The GM method is also shown to be more powerful than many existing methods at selecting important variables subject to FDR control, especially when the covariates are highly correlated.
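To make the construction concrete, here is a minimal sketch of the idea, not the authors' implementation: each predictor x_j is replaced by the mirror pair (x_j + z_j, x_j - z_j) with z_j Gaussian, an OLS refit yields the pair of coefficients, and the mirror statistic is thresholded by an estimated false discovery proportion. The scale of z_j and the use of plain OLS here are simplifying assumptions; the paper tunes the perturbation scale and also covers the Lasso.

```python
import numpy as np


def gaussian_mirror_stats(X, y, rng=None):
    """Sketch of Gaussian Mirror statistics with an OLS refit.

    For predictor j, replace column x_j by the mirror pair
    (x_j + z_j, x_j - z_j), z_j ~ N(0, s_j^2), refit, and form
    M_j = |b_j^+ + b_j^-| - |b_j^+ - b_j^-|.
    Null variables give M_j roughly symmetric around 0; true
    signals give large positive M_j.
    """
    rng = np.random.default_rng(rng)
    n, p = X.shape
    M = np.empty(p)
    for j in range(p):
        # Heuristic perturbation scale (an assumption; the paper
        # chooses it to symmetrize the null distribution of M_j).
        z = rng.normal(scale=X[:, j].std(), size=n)
        Xj = np.column_stack([X[:, j] + z, X[:, j] - z,
                              np.delete(X, j, axis=1)])
        beta, *_ = np.linalg.lstsq(Xj, y, rcond=None)
        bp, bm = beta[0], beta[1]
        M[j] = abs(bp + bm) - abs(bp - bm)
    return M


def gm_select(M, q=0.1):
    """Pick the smallest threshold t whose estimated FDP
    #{M_j <= -t} / #{M_j >= t} is at most q, then select M_j >= t."""
    for t in np.sort(np.abs(M)):
        fdp = np.sum(M <= -t) / max(np.sum(M >= t), 1)
        if fdp <= q:
            return np.where(M >= t)[0]
    return np.array([], dtype=int)
```

On simulated data with a few strong signals, the signal variables get mirror statistics near their true coefficients while the nulls scatter symmetrically near zero, so the data-driven threshold separates them.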



