DeepAI

Aggregation of Multiple Knockoffs

02/21/2020
by   Binh T. Nguyen, et al.

We develop an extension of the knockoff inference procedure introduced by Barber and Candès (2015). The new method, called Aggregation of Multiple Knockoffs (AKO), addresses the instability inherent in the random nature of knockoff-based inference. Specifically, AKO improves both stability and statistical power over the original knockoff algorithm while maintaining guarantees of false discovery rate (FDR) control. We present the new inference procedure, prove its core properties, and demonstrate its benefits in experiments on synthetic and real datasets.
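The abstract does not spell out the aggregation step, but one plausible sketch (not necessarily the authors' exact procedure) is: draw B independent knockoff copies, obtain an intermediate p-value per variable from each draw, combine the B values with a gamma-quantile aggregation rule in the spirit of Meinshausen et al. (2009), and feed the aggregated p-values to Benjamini-Hochberg step-up selection. The toy p-values, the choice gamma = 0.3, and the function names below are illustrative assumptions, not part of the paper.

```python
import numpy as np

def quantile_aggregate(pvals, gamma=0.3):
    """Quantile aggregation of B p-values per variable:
    q_j = min(1, quantile_gamma(p_j^(1..B)) / gamma)."""
    # pvals: (B, n_vars) array, one row per knockoff draw
    q = np.quantile(pvals, gamma, axis=0) / gamma
    return np.minimum(1.0, q)

def bh_select(pvals, fdr=0.1):
    """Benjamini-Hochberg step-up selection at nominal level `fdr`."""
    n = len(pvals)
    order = np.argsort(pvals)
    thresholds = fdr * np.arange(1, n + 1) / n
    below = pvals[order] <= thresholds
    k = (np.nonzero(below)[0].max() + 1) if below.any() else 0
    return np.sort(order[:k])

# Toy example: 25 knockoff draws, 50 variables; the first 5 are
# strong signals (tiny p-values), the rest are null (uniform).
rng = np.random.default_rng(0)
B, n_vars = 25, 50
pvals = rng.uniform(size=(B, n_vars))
pvals[:, :5] = rng.uniform(0.0, 0.001, size=(B, 5))

agg = quantile_aggregate(pvals, gamma=0.3)
selected = bh_select(agg, fdr=0.1)
print(selected)
```

Aggregating across draws is what removes the run-to-run variability of a single knockoff sample: a variable selected by chance in one draw will rarely survive the gamma-quantile of all B draws.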

Related Research

07/08/2019 · Aggregated False Discovery Rate Control
We propose an aggregation scheme for methods that control the false disc...

07/21/2020 · ADAGES: adaptive aggregation with stability for distributed feature selection
In this era of "big" data, not only the large amount of data keeps motiv...

05/30/2022 · Derandomized knockoffs: leveraging e-values for false discovery rate control
Model-X knockoffs is a flexible wrapper method for high-dimensional regr...

02/27/2020 · False Discovery Rate Control Under General Dependence By Symmetrized Data Aggregation
We develop a new class of distribution-free multiple testing rules for f...

11/21/2019 · Controlling the FDR in variable selection via multiple knockoffs
Barber and Candes recently introduced a feature selection method called ...

09/02/2019 · Analysis of SparseHash: an efficient embedding of set-similarity via sparse projections
Embeddings provide compact representations of signals in order to perfor...

Code Repositories

multiknockoffs

Estimation of multiple knockoff procedures

