Effectiveness of Distillation Attack and Countermeasure on Neural Network Watermarking

06/14/2019
by Ziqi Yang, et al.

The rise of machine learning as a service and model sharing platforms has created a need for traitor tracing and proof of authorship for models. Watermarking is the main technique used by existing methods for protecting the copyright of models. In this paper, we show that distillation, a widely used model transformation technique, is an effective attack for removing watermarks embedded by existing algorithms. This fragility stems from the fact that distillation does not retain watermark information that is redundant with respect to, and independent of, the main learning task. In response to this destructive effect of distillation, we design ingrain. It regularizes a neural network with an ingrainer model, which contains the watermark, and forces the network to also represent the knowledge of the ingrainer. Our extensive evaluations show that ingrain is more robust to the distillation attack, and that its robustness against other widely used transformation techniques is comparable to that of existing methods.
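
To make the two mechanisms concrete, below is a minimal PyTorch sketch of a distillation step and of an ingrain-style regularized training step. The function names (distill_step, ingrain_step), the temperature T, and the regularization weight lam are illustrative assumptions, not the paper's notation; the exact loss formulation in the paper may differ.

```python
import torch
import torch.nn.functional as F

def distill_step(student, teacher, x, T=4.0):
    # Standard knowledge distillation loss: the student matches the
    # teacher's softened output distribution. Watermark signals that do
    # not shape these output probabilities are simply not transferred,
    # which is why distillation strips task-independent watermarks.
    with torch.no_grad():
        teacher_logits = teacher(x)
    student_logits = student(x)
    return F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)

def ingrain_step(model, ingrainer, x, y, lam=0.5, T=4.0):
    # Ingrain-style regularization (a sketch): the usual task loss is
    # combined with a term that pulls the model toward the outputs of a
    # watermark-carrying ingrainer, entangling the watermark with the
    # main-task knowledge so that distillation tends to carry it along.
    logits = model(x)
    task_loss = F.cross_entropy(logits, y)
    with torch.no_grad():
        ingrainer_logits = ingrainer(x)
    ingrain_loss = F.kl_div(
        F.log_softmax(logits / T, dim=1),
        F.softmax(ingrainer_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    return task_loss + lam * ingrain_loss
```

In each training iteration the returned loss would be backpropagated through the student (respectively, the watermarked model) as usual; the teacher and the ingrainer are frozen and only provide target distributions.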
