A Random Subspace Technique That Is Resistant to a Limited Number of Features Corrupted by an Adversary

02/19/2019
by   Chris Mesterharm, et al.

In this paper, we consider batch supervised learning where an adversary is allowed to corrupt instances with arbitrarily large noise. The adversary may corrupt any l features in each instance and can change their values in any way. This noise is introduced on test instances, and the algorithm receives no label feedback for these instances. We provide several subspace voting techniques that can be used to transform existing algorithms, and we prove data-dependent performance bounds in this setting. The key insight behind our results is that we set our parameters so that a significant fraction of the voting hypotheses do not contain corrupted features and, for many real-world problems, these uncorrupted hypotheses are sufficient to achieve high accuracy. We empirically validate our approach on several datasets, including three new datasets that deal with side-channel electromagnetic information.
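To make the idea concrete, here is a minimal sketch of a random-subspace voting classifier in the spirit of the abstract. The specific choices (101 base hypotheses, 5-feature subspaces, logistic regression base learners, and the l = 3 corruption in the usage example) are illustrative assumptions, not the paper's exact construction or parameter settings.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split


class RandomSubspaceVoter:
    """Majority vote over base classifiers, each trained on a small random
    subset of the features. If an adversary corrupts at most l features of a
    test instance, many base hypotheses see no corrupted feature at all."""

    def __init__(self, n_hypotheses=101, subspace_size=5, random_state=0):
        self.n_hypotheses = n_hypotheses
        self.subspace_size = subspace_size
        self.rng = np.random.default_rng(random_state)
        self.models = []  # list of (feature_indices, fitted_model) pairs

    def fit(self, X, y):
        d = X.shape[1]
        self.models = []
        for _ in range(self.n_hypotheses):
            # Each hypothesis uses only a few features, so corrupting l
            # features touches only a fraction of the ensemble.
            idx = self.rng.choice(d, size=self.subspace_size, replace=False)
            clf = LogisticRegression(max_iter=1000).fit(X[:, idx], y)
            self.models.append((idx, clf))
        return self

    def predict(self, X):
        # Majority vote across base hypotheses for each test instance.
        votes = np.stack([clf.predict(X[:, idx]) for idx, clf in self.models])
        return np.apply_along_axis(
            lambda col: np.bincount(col.astype(int)).argmax(), 0, votes)


# Usage: corrupt l features of each test instance with arbitrary values.
X, y = make_classification(n_samples=2000, n_features=50, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

l = 3
rng = np.random.default_rng(1)
X_adv = X_te.copy()
for row in X_adv:
    row[rng.choice(X_adv.shape[1], size=l, replace=False)] = 1e6

model = RandomSubspaceVoter().fit(X_tr, y_tr)
print("clean accuracy:    ", (model.predict(X_te) == y_te).mean())
print("corrupted accuracy:", (model.predict(X_adv) == y_te).mean())

The intuition matches the abstract: a hypothesis built on k of the d features avoids all l corrupted features with probability roughly (1 - l/d)^k, so a small k keeps a large fraction of the vote uncorrupted while, on many real-world problems, each hypothesis remains accurate enough for the majority vote to succeed.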


