Achieving Adversarial Robustness via Sparsity

09/11/2020
by   Shufan Wang, et al.

Network pruning is known to produce compact models with little accuracy degradation. However, how the pruning process affects a network's robustness, and the mechanism behind this effect, remain unresolved. In this work, we theoretically prove that the sparsity of network weights is closely associated with model robustness. Through experiments on a variety of adversarial pruning methods, we find that weight sparsity does not hurt but instead improves robustness, and that both weight inheritance from the lottery ticket and adversarial training improve model robustness in network pruning. Based on these findings, we propose a novel adversarial training method called inverse weights inheritance, which imposes a sparse weight distribution on a large network by inheriting weights from a small network, thereby improving the robustness of the large network.
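The abstract only names the method; as a minimal sketch of the general idea behind inverse weights inheritance, the PyTorch snippet below embeds a trained small network's weights into a zero-initialized large network, so the large network starts from a sparse, inherited weight distribution. The class names, layer sizes, and block-embedding rule are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of "inverse weights inheritance". `SmallNet`,
# `LargeNet`, and the leading-block embedding rule are illustrative
# assumptions, not the paper's code.
import torch
import torch.nn as nn


class SmallNet(nn.Module):
    """A narrow network, trained (e.g., adversarially) to convergence."""

    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(784, 64)
        self.fc2 = nn.Linear(64, 10)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))


class LargeNet(nn.Module):
    """A wide network whose initialization inherits from SmallNet."""

    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(784, 256)
        self.fc2 = nn.Linear(256, 10)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))


def inverse_weights_inheritance(small: nn.Module, large: nn.Module) -> None:
    """Copy each small parameter tensor into the leading block of the
    corresponding large tensor and zero the rest, imposing a sparse
    weight distribution on the large network before it is trained."""
    with torch.no_grad():
        for (_, p_small), (_, p_large) in zip(
            small.named_parameters(), large.named_parameters()
        ):
            p_large.zero_()
            # Embed the small tensor into the leading slice of the large one.
            slices = tuple(slice(0, s) for s in p_small.shape)
            p_large[slices] = p_small


small, large = SmallNet(), LargeNet()
# ... adversarially train `small` here ...
inverse_weights_inheritance(small, large)
# `large` now fine-tunes (again adversarially) from the sparse,
# inherited initialization rather than from a dense random one.
```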


Related Research

02/14/2022 · Finding Dynamics Preserving Adversarial Winning Tickets
Modern deep neural networks (DNNs) are vulnerable to adversarial attacks...

03/29/2019 · Second Rethinking of Network Pruning in the Adversarial Setting
It is well known that deep neural networks (DNNs) are vulnerable to adve...

04/10/2020 · Blind Adversarial Pruning: Balance Accuracy, Efficiency and Robustness
With the growth of interest in the attack and defense of deep neural net...

05/21/2019 · Revisiting hard thresholding for DNN pruning
The most common method for DNN pruning is hard thresholding of network w...

08/10/2021 · On the Effect of Pruning on Adversarial Robustness
Pruning is a well-known mechanism for reducing the computational cost of...

09/10/2021 · Dynamic Collective Intelligence Learning: Finding Efficient Sparse Model via Refined Gradients for Pruned Weights
With the growth of deep neural networks (DNN), the number of DNN paramet...

06/15/2022 · Can pruning improve certified robustness of neural networks?
With the rapid development of deep learning, the sizes of neural network...
