Kernel Knockoffs Selection for Nonparametric Additive Models

05/25/2021
by Xiaowu Dai, et al.

Thanks to its fine balance of model flexibility and interpretability, the nonparametric additive model has been widely used, and variable selection for this type of model has received considerable attention. However, none of the existing solutions can control the false discovery rate (FDR) in the finite-sample setting. The knockoffs framework is a recent proposal that can effectively control the FDR with a finite sample size, but few knockoffs solutions are applicable to nonparametric models. In this article, we propose a novel kernel knockoffs selection procedure for the nonparametric additive model. The procedure integrates three key components: knockoffs, subsampling for stability, and random feature mapping for nonparametric function approximation. We show that the proposed method is guaranteed to control the FDR at any finite sample size, and that its power approaches one as the sample size tends to infinity. We demonstrate the efficacy of our method through extensive numerical analyses and comparisons with alternative solutions. Our proposal thus makes useful contributions to the methodology of nonparametric variable selection, FDR-based inference, and knockoffs.
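To make the three-component recipe concrete, here is a minimal sketch in Python of the knockoffs-plus-random-features idea. It is not the authors' algorithm: it assumes i.i.d. N(0, 1) features (so independent Gaussian copies are valid model-X knockoffs), substitutes a plain lasso with per-feature group norms for the paper's fitting step, and omits the subsampling component entirely. All names and parameters are illustrative.

```python
# Illustrative sketch only, under the assumptions stated above.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, D = 500, 20, 25          # samples, features, random features per coordinate

# Simulate an additive model: only the first 3 features are active.
X = rng.standard_normal((n, p))
y = np.sin(2 * X[:, 0]) + X[:, 1] ** 2 + np.cos(X[:, 2]) + 0.5 * rng.standard_normal(n)

# Knockoffs: independent copies are valid here because the X_j are i.i.d. N(0, 1).
X_ko = rng.standard_normal((n, p))

def rff(x, omega, b):
    """Random Fourier features approximating the RBF kernel on one coordinate."""
    return np.sqrt(2.0 / len(omega)) * np.cos(np.outer(x, omega) + b)

# Expand each original and knockoff coordinate with the SAME random frequencies.
omegas = rng.standard_normal((p, D))
phases = rng.uniform(0, 2 * np.pi, (p, D))
Phi = np.hstack([rff(X[:, j], omegas[j], phases[j]) for j in range(p)] +
                [rff(X_ko[:, j], omegas[j], phases[j]) for j in range(p)])

# Fit a sparse linear model on the expanded design; the group norm of the
# coefficients attached to feature j's basis serves as its importance score.
coef = Lasso(alpha=0.01).fit(Phi, y - y.mean()).coef_
Z = np.array([np.linalg.norm(coef[j * D:(j + 1) * D]) for j in range(2 * p)])
W = Z[:p] - Z[p:]              # knockoff statistics: large positive => likely signal

# Knockoff+ threshold at target FDR level q.
q = 0.2
ts = np.sort(np.abs(W[W != 0]))
valid = [t for t in ts if (1 + np.sum(W <= -t)) / max(np.sum(W >= t), 1) <= q]
tau = valid[0] if valid else np.inf
print("selected features:", np.where(W >= tau)[0])
```

In this toy setup the statistics W_j tend to be large and positive for the three active features and to be symmetric around zero for the nulls, which is what lets the knockoff+ threshold bound the FDR at q with a finite sample.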
