Normalizing Flows for Knockoff-free Controlled Feature Selection

by Derek Hansen et al.

The goal of controlled feature selection is to discover the features a response depends on while limiting the proportion of false discoveries to a predefined level. Recently, multiple methods have been proposed that use deep learning to generate knockoffs for controlled feature selection through the Model-X knockoff framework. We demonstrate, however, that these methods often fail to control the false discovery rate (FDR). There are two reasons for this shortcoming. First, these methods often learn inaccurate models of the features. Second, the "swap" property, which is required for knockoffs to be valid, is often not well enforced. We propose a new procedure called FlowSelect that remedies both of these problems. To model the features more accurately, FlowSelect uses normalizing flows, the state-of-the-art method for density estimation. To circumvent the need to enforce the swap property, FlowSelect uses a novel MCMC-based procedure to compute p-values for each feature directly. Asymptotically, FlowSelect controls the FDR exactly. Empirically, FlowSelect controls the FDR well on both synthetic and semi-synthetic benchmarks, whereas competing knockoff-based approaches fail to do so, and it also demonstrates greater power on these benchmarks. Additionally, using data from a genome-wide association study of soybeans, FlowSelect correctly infers the genetic variants associated with specific soybean traits.
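To make the knockoff-free pattern concrete, here is a minimal sketch of the general recipe the abstract describes: compute a per-feature p-value with a conditional randomization test, then select features at a target FDR level with the Benjamini-Hochberg procedure. This sketch substitutes a known independent-Gaussian feature model for FlowSelect's learned normalizing flow and uses a simple correlation statistic in place of the paper's MCMC-based procedure, so it illustrates only the overall structure, not the authors' actual algorithm.

```python
import numpy as np

def crt_pvalues(X, y, n_null=200, rng=None):
    """Conditional-randomization-test p-value per feature.

    Test statistic: |correlation| between feature j and y. Null draws
    resample feature j from the (assumed known) feature model -- here an
    independent standard normal, standing in for a learned density model.
    """
    rng = np.random.default_rng(rng)
    n, p = X.shape
    pvals = np.empty(p)
    for j in range(p):
        obs = abs(np.corrcoef(X[:, j], y)[0, 1])
        null = np.empty(n_null)
        for k in range(n_null):
            xj = rng.standard_normal(n)          # resample feature j
            null[k] = abs(np.corrcoef(xj, y)[0, 1])
        # Add-one correction keeps the p-value valid at finite n_null.
        pvals[j] = (1 + np.sum(null >= obs)) / (1 + n_null)
    return pvals

def benjamini_hochberg(pvals, q=0.1):
    """Indices of features selected at FDR level q."""
    p = len(pvals)
    order = np.argsort(pvals)
    below = pvals[order] <= q * np.arange(1, p + 1) / p
    if not below.any():
        return np.array([], dtype=int)
    k = np.max(np.where(below)[0])
    return np.sort(order[:k + 1])

# Toy data: y depends only on features 0 and 1.
rng = np.random.default_rng(0)
X = rng.standard_normal((500, 10))
y = 2 * X[:, 0] - 3 * X[:, 1] + rng.standard_normal(500)
selected = benjamini_hochberg(crt_pvalues(X, y, rng=1), q=0.1)
print(selected)
```

In FlowSelect, the resampling step above is replaced by MCMC draws of feature j conditional on the remaining features under a fitted normalizing-flow density, which is what removes the need to construct knockoffs satisfying the swap property.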






