Adversarial Classification via Distributional Robustness with Wasserstein Ambiguity

by Nam Ho-Nguyen, et al.

We study a model for adversarial classification based on distributionally robust chance constraints. We show that under Wasserstein ambiguity, the model aims to minimize the conditional value-at-risk of the distance to misclassification, and we explore links to previous adversarial classification models and maximum margin classifiers. We also provide a reformulation of the distributionally robust model for linear classifiers, and show it is equivalent to minimizing a regularized ramp loss. Numerical experiments show that, despite the nonconvexity, standard descent methods appear to converge to the global minimizer for this problem. Inspired by this observation, we show that, for a certain benign distribution, the regularized ramp loss minimization problem has a single stationary point, at the global minimizer.
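The regularized ramp-loss minimization mentioned above can be illustrated with a small subgradient-descent sketch for a linear classifier. Everything here is an assumption for illustration: the synthetic data, the regularization weight `lam`, the step size, and the choice of norm are not taken from the paper, and the ramp loss (a hinge loss truncated at 1) is nonconvex, so this is a heuristic descent, not a certified global method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-cluster data with labels in {-1, +1}; purely illustrative.
n = 200
X = np.vstack([rng.normal(+2.0, 1.0, size=(n, 2)),
               rng.normal(-2.0, 1.0, size=(n, 2))])
y = np.concatenate([np.ones(n), -np.ones(n)])

def ramp(m):
    # Ramp loss: the hinge loss max(0, 1 - m) truncated at 1.
    return np.minimum(1.0, np.maximum(0.0, 1.0 - m))

lam = 0.05  # regularization weight (illustrative choice)

def objective(w, b):
    # Empirical ramp loss plus a norm penalty on the weights.
    return ramp(y * (X @ w + b)).mean() + lam * np.linalg.norm(w)

def step(w, b, lr=0.1):
    # One subgradient step; the ramp is flat outside (0, 1), so only
    # points whose margin lies on the linear piece contribute.
    m = y * (X @ w + b)
    act = (m > 0.0) & (m < 1.0)
    gw = -(y[act][:, None] * X[act]).sum(axis=0) / len(y)
    gw += lam * w / (np.linalg.norm(w) + 1e-12)
    gb = -y[act].sum() / len(y)
    return w - lr * gw, b - lr * gb

w, b = 0.01 * rng.normal(size=2), 0.0
start = objective(w, b)
for _ in range(300):
    w, b = step(w, b)
```

On well-separated data like this, the subgradient iterates settle near a classifier with a small objective value, consistent with the paper's observation that descent methods behave well on this nonconvex problem despite the flat regions of the ramp.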



