Law and Adversarial Machine Learning

10/25/2018
by Ram Shankar Siva Kumar, et al.

When machine learning systems fail because of adversarial manipulation, what kind of legal relief can society expect? Through scenarios grounded in the adversarial ML literature, we explore how aspects of computer crime, copyright, and tort law interface with perturbation, poisoning, model stealing, and model inversion attacks, showing that some attacks are more likely to result in liability than others. We end with a call to action for ML researchers: invest in transparent benchmarks of attacks and defenses; architect ML systems with forensics in mind; and think more about adversarial machine learning in the context of civil liberties. The paper is targeted at ML researchers who have no legal background.


