Dropout Attacks

09/04/2023
by Andrew Yuan, et al.

Dropout is a common operator in deep learning that aims to prevent overfitting by randomly dropping neurons during training. This paper introduces a new family of poisoning attacks against neural networks named DROPOUTATTACK. DROPOUTATTACK targets the dropout operator by manipulating which neurons are dropped, rather than selecting them uniformly at random. We design, implement, and evaluate four DROPOUTATTACK variants that cover a broad range of scenarios. These attacks can slow or stop training, destroy the prediction accuracy of target classes, and sabotage either the precision or the recall of a target class. In our experiments training a VGG-16 model on CIFAR-100, our attack reduces the precision of the victim class by 34.6% without incurring any degradation in overall model accuracy.
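To illustrate the core idea, here is a minimal NumPy sketch of a poisoned dropout operator. It is not the paper's implementation; the function names, the fixed drop budget, and the `target_idx` parameter are illustrative assumptions. A benign dropout samples the dropped set uniformly, whereas the biased variant always drops a chosen set of neurons and fills the rest of the drop budget at random, so the aggregate drop rate still looks like `p`:

```python
import numpy as np

def standard_dropout(x, p, rng):
    """Benign inverted dropout: each neuron is kept with probability 1-p,
    and surviving activations are rescaled by 1/(1-p)."""
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)

def biased_dropout(x, p, target_idx, rng):
    """Illustrative poisoned dropout (not the paper's code): always drop
    the neurons in `target_idx`, then fill the remaining drop budget with
    uniformly chosen neurons so the overall drop rate stays at p."""
    n = x.size
    k = int(round(p * n))                     # total neurons to drop
    target = np.asarray(target_idx)[:k]       # attacker-chosen victims
    rest = np.setdiff1d(np.arange(n), target)
    extra = rng.choice(rest, size=k - target.size, replace=False)
    dropped = np.concatenate([target, extra])
    mask = np.ones(n, dtype=bool)
    mask[dropped] = False
    return x * mask / (1.0 - p)
```

Because the drop count and scaling match an honest dropout layer, summary statistics of the activations do not reveal the manipulation; only the (hidden) selection distribution differs.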

