Aggregation of Multiple Knockoffs

02/21/2020
by Binh T. Nguyen, et al.

We develop an extension of the Knockoff Inference procedure, introduced by Barber and Candes (2015). This new method, called Aggregation of Multiple Knockoffs (AKO), addresses the instability inherent to the random nature of Knockoff-based inference. Specifically, AKO improves both the stability and power compared with the original Knockoff algorithm while still maintaining guarantees for False Discovery Rate control. We provide a new inference procedure, prove its core properties, and demonstrate its benefits in a set of experiments on synthetic and real datasets.
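The abstract does not spell out the aggregation step, but the general recipe behind this line of work is to run the knockoff filter several times, convert each draw's statistics into per-feature p-values, pool them, and select with a standard FDR procedure. The sketch below is a minimal illustration of that pooling stage only, assuming per-draw p-values are already available; the helper names `quantile_aggregate` and `bh_select` are hypothetical, the quantile-aggregation rule follows the style of Meinshausen et al. (2009), and this is not the authors' exact AKO procedure.

```python
import numpy as np

def quantile_aggregate(pvals, gamma=0.3):
    # pvals: (B, p) array of p-values, one row per knockoff draw.
    # Quantile aggregation: take the gamma-quantile across draws,
    # rescale by 1/gamma, and cap at 1 so outputs stay valid p-values.
    return np.minimum(1.0, np.quantile(pvals, gamma, axis=0) / gamma)

def bh_select(pvals, fdr=0.1):
    # Benjamini-Hochberg step-up procedure on the aggregated p-values:
    # compare sorted p-values to the line fdr * k / p and keep every
    # feature up to the largest index still under the line.
    p = len(pvals)
    order = np.argsort(pvals)
    thresh = fdr * np.arange(1, p + 1) / p
    below = pvals[order] <= thresh
    if not below.any():
        return np.array([], dtype=int)
    k = np.max(np.nonzero(below)[0])
    return np.sort(order[: k + 1])
```

Averaging over B draws in this way is what damps the run-to-run variability that a single random knockoff construction exhibits.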


Related research

07/08/2019
Aggregated False Discovery Rate Control
We propose an aggregation scheme for methods that control the false disc...

07/21/2020
ADAGES: adaptive aggregation with stability for distributed feature selection
In this era of "big" data, not only the large amount of data keeps motiv...

05/30/2022
Derandomized knockoffs: leveraging e-values for false discovery rate control
Model-X knockoffs is a flexible wrapper method for high-dimensional regr...

02/27/2020
False Discovery Rate Control Under General Dependence By Symmetrized Data Aggregation
We develop a new class of distribution-free multiple testing rules for f...

11/21/2019
Controlling the FDR in variable selection via multiple knockoffs
Barber and Candes recently introduced a feature selection method called ...

10/26/2018
Improving the Stability of the Knockoff Procedure: Multiple Simultaneous Knockoffs and Entropy Maximization
The Model-X knockoff procedure has recently emerged as a powerful approa...

07/18/2014
Extensions of stability selection using subsamples of observations and covariates
We introduce extensions of stability selection, a method to stabilise va...
