Deep Knockoffs

11/16/2018
by Yaniv Romano, et al.

This paper introduces a machine for sampling approximate model-X knockoffs for arbitrary and unspecified data distributions using deep generative models. The main idea is to iteratively refine a knockoff sampling mechanism until a criterion measuring the validity of the produced knockoffs is optimized; this criterion is inspired by the popular maximum mean discrepancy in machine learning and can be thought of as measuring the distance to pairwise exchangeability between original and knockoff features. By building upon the existing model-X framework, we thus obtain a flexible and model-free statistical tool to perform controlled variable selection. Extensive numerical experiments and quantitative tests confirm the generality, effectiveness, and power of our deep knockoff machines. Finally, we apply this new method to a real study of mutations linked to changes in drug resistance in the human immunodeficiency virus.
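The exchangeability criterion described above can be illustrated with a small sketch. This is not the authors' implementation: it uses a plain Gaussian-kernel MMD to compare the joint sample (X, X̃) against its fully swapped counterpart (X̃, X), and the toy data, bandwidth, and knockoff constructions are illustrative assumptions. A valid knockoff should make the two joint distributions close, so the MMD score acts as a measurable distance to pairwise exchangeability.

```python
import numpy as np

def gaussian_kernel(a, b, sigma):
    # Pairwise Gaussian kernel matrix between the rows of a and b.
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def mmd2(p, q, sigma=2.0):
    # Biased (V-statistic) estimate of the squared maximum mean discrepancy.
    return (gaussian_kernel(p, p, sigma).mean()
            + gaussian_kernel(q, q, sigma).mean()
            - 2.0 * gaussian_kernel(p, q, sigma).mean())

rng = np.random.default_rng(0)
n, p = 200, 3
X = rng.normal(size=(n, p))

# A valid knockoff for independent N(0, 1) features: an independent copy.
Xk_good = rng.normal(size=(n, p))
# An invalid knockoff: the marginal scale is wrong.
Xk_bad = 3.0 * rng.normal(size=(n, p))

# Distance to exchangeability: compare (X, Xk) with the swapped pair (Xk, X).
score_good = mmd2(np.hstack([X, Xk_good]), np.hstack([Xk_good, X]))
score_bad = mmd2(np.hstack([X, Xk_bad]), np.hstack([Xk_bad, X]))
print(score_good, score_bad)
```

In the paper this discrepancy is not just evaluated but minimized: the knockoff generator is a deep network whose parameters are updated by gradient steps on a criterion of this kind, iteratively refining the sampler until the score is driven toward zero.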


Related research:

- On Unifying Deep Generative Models (06/02/2017)
- Auto-Encoding Knockoff Generator for FDR Controlled Variable Selection (09/27/2018)
- Knockoff Boosted Tree for Model-Free Variable Selection (02/20/2020)
- Optimization of Annealed Importance Sampling Hyperparameters (09/27/2022)
- Deep Tempering (10/01/2014)
- KernelNet: A Data-Dependent Kernel Parameterization for Deep Generative Modeling (12/02/2019)
