Binary Stochastic Filtering: feature selection and beyond

07/08/2020
by Andrii Trelin, et al.

Feature selection is one of the most important tools for understanding data and machine learning models. Among other methods, sparsity induced by an L^1 penalty is one of the simplest and best-studied approaches to this problem. Although such regularization is frequently used in neural networks to achieve sparsity of weights or unit activations, it is unclear how it can be employed for feature selection. This work extends neural networks with the ability to select features automatically by rethinking how the sparsity regularization is used, namely by stochastically penalizing feature involvement rather than the layer weights. The proposed method demonstrates superior efficiency compared to several classical feature selection methods, incurs minimal or no computational overhead, and can be applied directly to any existing architecture. Furthermore, it generalizes readily to neuron pruning and to the selection of regions of importance in spectral data.
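
The abstract does not spell out the implementation, but the idea of stochastically penalizing feature involvement can be sketched as a trainable Bernoulli gate on each input feature, with an L^1-style penalty on the learned keep-probabilities. Below is a minimal PyTorch sketch under those assumptions; the class name BinaryStochasticFilter, the straight-through gradient estimator, and all hyperparameters are illustrative choices, not the paper's exact formulation.

import math
import torch
import torch.nn as nn

class BinaryStochasticFilter(nn.Module):
    # Illustrative stochastic feature gate (not the paper's exact code).
    # Each input feature i is multiplied by a Bernoulli(p_i) sample during
    # training; the keep-probabilities p_i are learned and penalized towards
    # zero, so uninformative features are switched off.
    def __init__(self, num_features, init_keep_prob=0.9):
        super().__init__()
        init_logit = math.log(init_keep_prob / (1.0 - init_keep_prob))
        self.logits = nn.Parameter(torch.full((num_features,), init_logit))

    def keep_probs(self):
        return torch.sigmoid(self.logits)

    def forward(self, x):
        p = self.keep_probs()
        if self.training:
            # Straight-through estimator: the forward pass uses the hard
            # binary sample, the backward pass treats it as if it were p.
            hard = torch.bernoulli(p)
            mask = hard.detach() + p - p.detach()
        else:
            # Deterministic at inference: keep features with p_i > 0.5.
            mask = (p > 0.5).float()
        return x * mask

    def sparsity_penalty(self):
        # L^1-style penalty on the keep-probabilities, added to the task loss.
        return self.keep_probs().sum()

# Usage sketch: prepend the filter to any model and add the penalty to the loss.
model = nn.Sequential(BinaryStochasticFilter(num_features=30), nn.Linear(30, 2))
x, y = torch.randn(8, 30), torch.randint(0, 2, (8,))
loss = nn.functional.cross_entropy(model(x), y) + 1e-3 * model[0].sparsity_penalty()
loss.backward()

The straight-through trick is only one common way to backpropagate through the binary sampling step; the paper may use a different estimator.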

Related research:

02/12/2019 · Binary Stochastic Filtering: a Solution for Supervised Feature Selection and Neural Network Shape Optimization
Binary Stochastic Filtering (BSF), the algorithm for feature selection a...

04/21/2023 · Effective Neural Network L_0 Regularization With BinMask
L_0 regularization of neural networks is a fundamental problem. In addit...

10/03/2022 · Sparsity by Redundancy: Solving L_1 with a Simple Reparametrization
We identify and prove a general principle: L_1 sparsity can be achieved ...

12/13/2019 · Neural Network Surgery with Sets
The cost to train machine learning models has been increasing exponentia...

02/14/2011 · Feature selection via simultaneous sparse approximation for person specific face verification
There is an increasing use of some imperceivable and redundant local fea...

03/20/2023 · Induced Feature Selection by Structured Pruning
The advent of sparsity inducing techniques in neural networks has been o...

09/29/2022 · Sequential Attention for Feature Selection
Feature selection is the problem of selecting a subset of features for a...
