Classification with Strategically Withheld Data

12/18/2020
by Anilesh K. Krishnaswamy, et al.

Machine learning techniques can be useful in applications such as credit approval and college admission. However, to be classified more favorably in such contexts, an agent may decide to strategically withhold some of her features, such as bad test scores. This is a missing data problem with a twist: which data is missing depends on the chosen classifier, because it is the classifier itself that creates the incentive to withhold certain feature values. We address the problem of training classifiers that are robust to this behavior. We design three classification methods: Mincut, Hill-Climbing (HC), and Incentive-Compatible Logistic Regression (IC-LR). We show that Mincut is optimal when the true distribution of data is fully known. However, it can produce complex decision boundaries, and hence be prone to overfitting in some cases. Based on a characterization of truthful classifiers (i.e., those that give no incentive to strategically hide features), we devise a simpler alternative called HC, which consists of a hierarchical ensemble of out-of-the-box classifiers trained using a specialized hill-climbing procedure that we show to be convergent. For several reasons, Mincut and HC are not effective in utilizing a large number of complementarily informative features. To address this limitation, we present IC-LR, a modification of Logistic Regression that removes the incentive to strategically drop features. We also show that our algorithms perform well in experiments on real-world data sets, and present insights into their relative performance in different settings.

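The abstract describes IC-LR only at a high level. As one illustrative way to see how the incentive to withhold can be removed (a minimal sketch under stated assumptions, not the paper's exact construction): if every feature is encoded so that a withheld value contributes zero to the score, and all feature coefficients are constrained to be non-negative, then dropping a feature can never raise an agent's predicted score. The encoding, regularization, and bounded L-BFGS solver below are illustrative choices.

```python
# Sketch of an incentive-compatible logistic regression, ASSUMING:
#   * each feature is encoded to be non-negative, with a withheld feature
#     contributing 0 to the score, and
#   * all feature coefficients are constrained to be >= 0,
# so that withholding a feature can never increase the predicted score.
import numpy as np
from scipy.optimize import minimize

def fit_nonneg_logreg(X, y, l2=1e-3):
    """Fit logistic regression with non-negative feature weights.

    X : (n, d) array with non-negative entries; withheld features encoded as 0.
    y : (n,) array of 0/1 labels.
    """
    n, d = X.shape

    def neg_log_likelihood(params):
        w, b = params[:d], params[d]
        z = X @ w + b
        # log(1 + exp(z)), computed stably
        log1pexp = np.logaddexp(0.0, z)
        return np.sum(log1pexp - y * z) + l2 * np.sum(w ** 2)

    x0 = np.zeros(d + 1)
    bounds = [(0.0, None)] * d + [(None, None)]  # weights >= 0, intercept free
    res = minimize(neg_log_likelihood, x0, method="L-BFGS-B", bounds=bounds)
    return res.x[:d], res.x[d]

# Toy usage: hiding a feature (encoding it as 0) can only lower the score,
# since every weight is non-negative.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 3))
y = (X @ np.array([2.0, 1.0, 0.5]) + 0.1 * rng.normal(size=200) > 1.7).astype(int)
w, b = fit_nonneg_logreg(X, y)
full_score = X[0] @ w + b
withheld_score = np.array([X[0][0], 0.0, X[0][2]]) @ w + b  # second feature hidden
assert withheld_score <= full_score
```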
