ES Attack: Model Stealing against Deep Neural Networks without Data Hurdles

09/21/2020
by Xiaoyong Yuan, et al.

Deep neural networks (DNNs) have become essential components of commercialized machine learning services, such as Machine Learning as a Service (MLaaS). Recent studies show that these services face severe privacy threats: well-trained DNNs owned by MLaaS providers can be stolen through their public APIs, in what are known as model stealing attacks. However, most existing work underestimates the impact of such attacks by assuming that a successful attack must first acquire the victim DNN's confidential training data, or auxiliary data closely related to it. In this paper, we propose ES Attack, a novel model stealing attack without any data hurdles. Using heuristically generated synthetic data, ES Attack iteratively trains a substitute model and eventually obtains a functionally equivalent copy of the victim DNN. The experimental results reveal the severity of ES Attack: i) ES Attack successfully steals the victim model without data hurdles, and even outperforms most existing model stealing attacks that use auxiliary data in terms of model accuracy; ii) most countermeasures are ineffective against ES Attack; iii) ES Attack facilitates further attacks that rely on the stolen model.
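
The attack's name reflects its two alternating phases: an E-step that estimates the victim by distilling its soft labels into the substitute, and an S-step that synthesizes the next round of query data from the current substitute. The sketch below illustrates one way such a loop can be wired up in PyTorch; it is a minimal illustration under stated assumptions, not the authors' implementation. The victim is assumed reachable only through a hypothetical query_victim(x) API that returns softmax probabilities, and a simple gradient-based refinement of random noise stands in for the paper's synthesis heuristics.

import torch
import torch.nn.functional as F


def synthesize_batch(substitute, batch_size, input_shape, num_classes,
                     steps=10, lr=0.1):
    # S-step (illustrative stand-in): starting from Gaussian noise, nudge
    # each input toward a randomly chosen class under the *current*
    # substitute, so synthetic queries track the decision boundaries
    # learned so far.
    x = torch.randn(batch_size, *input_shape, requires_grad=True)
    targets = torch.randint(0, num_classes, (batch_size,))
    opt = torch.optim.Adam([x], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        F.cross_entropy(substitute(x), targets).backward()
        opt.step()
    return x.detach()


def es_attack(substitute, query_victim, input_shape, num_classes,
              rounds=50, batch_size=128, batches_per_round=100):
    # Alternate data synthesis (S-step) and substitute training (E-step).
    # query_victim is the attacker's only access to the victim model and is
    # assumed to return a batch of softmax probability vectors.
    opt = torch.optim.SGD(substitute.parameters(), lr=0.01, momentum=0.9)
    for _ in range(rounds):
        for _ in range(batches_per_round):
            # S-step: generate synthetic inputs from the current substitute.
            x = synthesize_batch(substitute, batch_size, input_shape,
                                 num_classes)
            # Query the victim's public API for soft labels.
            with torch.no_grad():
                y_victim = query_victim(x)
            # E-step: distill the victim's outputs into the substitute.
            opt.zero_grad()
            loss = F.kl_div(F.log_softmax(substitute(x), dim=1),
                            y_victim, reduction="batchmean")
            loss.backward()
            opt.step()
    return substitute

Because the synthesis step adapts as the substitute improves, later queries probe regions near the victim's decision boundaries, which is what lets an attack of this form dispense with real training data; the total query budget (rounds x batches_per_round x batch_size) is the main cost a provider could meter or rate-limit.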
