Differentiable Subset Sampling

01/29/2019
by Sang Michael Xie, et al.

Many machine learning tasks require sampling a subset of items from a collection. Due to the non-differentiability of subset sampling, the procedure is usually not included in end-to-end deep learning models. We show that through a connection to weighted reservoir sampling, the Gumbel-max trick can be extended to produce exact subset samples, and that a recently proposed top-k relaxation can be used to differentiate through the subset sampling procedure. We test our method on end-to-end tasks requiring subset sampling, including a differentiable k-nearest neighbors task and an instance-wise feature selection task for model interpretability.
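The abstract compresses two ideas: drawing an exact k-element subset by taking the top-k of Gumbel-perturbed log-weights (the weighted-reservoir-sampling view of the Gumbel-max trick), and replacing the hard top-k with an iterative softmax so gradients can flow to the weights. The sketch below illustrates both under stated assumptions; the function names, temperature default, and masking details are my own and this is not the authors' released implementation, only a minimal PyTorch illustration of the general technique the abstract describes.

import torch

def gumbel_keys(log_w, eps=1e-12):
    # Perturb log-weights with Gumbel(0, 1) noise (the Gumbel-max trick).
    u = torch.rand_like(log_w).clamp(min=eps, max=1.0 - eps)
    return log_w - torch.log(-torch.log(u))

def hard_subset(log_w, k):
    # Exact k-subset sample without replacement: indices of the k largest
    # perturbed keys, matching weighted reservoir sampling.
    return torch.topk(gumbel_keys(log_w), k).indices

def relaxed_subset(log_w, k, tau=0.5, eps=1e-12):
    # Differentiable relaxation: k rounds of tempered softmax over the
    # perturbed keys, softly masking out already-selected items each round.
    # Returns a relaxed k-hot vector that sums (approximately) to k.
    keys = gumbel_keys(log_w)
    khot = torch.zeros_like(keys)
    onehot_approx = torch.zeros_like(keys)
    for _ in range(k):
        keys = keys + torch.log(torch.clamp(1.0 - onehot_approx, min=eps))
        onehot_approx = torch.softmax(keys / tau, dim=-1)
        khot = khot + onehot_approx
    return khot

# Example: a relaxed 3-subset over 10 items with learnable log-weights.
log_w = torch.randn(10, requires_grad=True)
khot = relaxed_subset(log_w, k=3)
khot.sum().backward()  # gradients flow back to log_w

As the temperature tau is annealed toward zero, the relaxed k-hot vector approaches the discrete indicator produced by the exact top-k sample, which is the usual trade-off when such relaxations are used inside end-to-end models.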


Related research

10/04/2022  SIMPLE: A Gradient Estimator for k-Subset Sampling
k-subset sampling is ubiquitous in machine learning, enabling regulariza...

03/01/2019  Parallel Weighted Random Sampling
Data structures for efficient sampling from a set of weighted items are ...

07/26/2018  Superpixel Sampling Networks
Superpixels provide an efficient low/mid-level representation of image d...

01/07/2019  DPPNet: Approximating Determinantal Point Processes with Deep Networks
Determinantal Point Processes (DPPs) provide an elegant and versatile wa...

01/09/2023  Differentiable Simulations for Enhanced Sampling of Rare Events
We develop a novel approach to enhanced sampling of chemically reactive ...

05/28/2021  Differentiable Artificial Reverberation
We propose differentiable artificial reverberation (DAR), a family of ar...

08/17/2019  CompenNet++: End-to-end Full Projector Compensation
Full projector compensation aims to modify a projector input image such ...
