Improving knockoffs with conditional calibration

08/19/2022
by Yixiang Luo, et al.

The knockoff filter of Barber and Candes (arXiv:1404.5609) is a flexible framework for multiple testing in supervised learning models, based on introducing synthetic predictor variables to control the false discovery rate (FDR). Using the conditional calibration framework of Fithian and Lei (arXiv:2007.10438), we introduce the calibrated knockoff procedure, a method that uniformly improves the power of any knockoff procedure. We implement our method for fixed-X knockoffs and show theoretically and empirically that the improvement is especially notable in two contexts where knockoff methods can be nearly powerless: when the rejection set is small, and when the structure of the design matrix prevents us from constructing good knockoff variables. In these contexts, calibrated knockoffs even outperform competing FDR-controlling methods like the (dependence-adjusted) Benjamini-Hochberg procedure in many scenarios.
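
For context, the data-dependent thresholding rule at the heart of the knockoff filter (the "knockoff+" rule of Barber and Candes) can be written down in a few lines. The sketch below is a minimal illustration in Python, not code from the paper; the function name is ours, and it assumes the feature statistics W (one per variable, computed from the design matrix and its knockoff copy) are already available.

```python
import numpy as np

def knockoff_plus_select(W, q=0.1):
    """Knockoff+ selection at target FDR level q.

    Under the null, the sign of W_j is symmetric, while a large positive
    W_j is evidence that variable j is a true signal.
    """
    W = np.asarray(W, dtype=float)
    # Candidate thresholds are the nonzero magnitudes of the statistics.
    for t in np.sort(np.abs(W[W != 0])):
        # Estimated false discovery proportion at threshold t.
        fdp_hat = (1 + np.sum(W <= -t)) / max(np.sum(W >= t), 1)
        if fdp_hat <= q:                   # smallest feasible threshold
            return np.flatnonzero(W >= t)  # data-dependent rejection set
    return np.array([], dtype=int)         # no feasible threshold: select nothing
```

The "+1" in the numerator means the procedure cannot make fewer than roughly 1/q discoveries, which helps explain why knockoffs can be nearly powerless when the rejection set is small, one of the low-power regimes the abstract highlights.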


