Mitigating Adversarial Attacks in Deepfake Detection: An Exploration of Perturbation and AI Techniques

02/22/2023
by Saminder Dhesi, et al.

Deep learning (DL) underpins much of modern machine learning, yet DL models are vulnerable to adversarial examples across a wide range of applications. Such examples can even be aimed at humans, enabling the creation of false media such as deepfakes, which are often used to shape public opinion and damage the reputation of public figures. This article explores adversarial examples, which consist of small perturbations added to clean images or videos, and their ability to deceive DL algorithms. The proposed approach achieved an accuracy of 76.2%.
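The abstract does not specify how the perturbations are constructed; as a minimal sketch of one standard way such an attack is crafted, the snippet below implements the Fast Gradient Sign Method (FGSM) in PyTorch. The function name, the epsilon value, and the assumption of a [0, 1] pixel range are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

def fgsm_perturb(model, image, label, epsilon=0.03):
    # Craft an adversarial example with the Fast Gradient Sign Method:
    # an L-infinity-bounded perturbation that increases the model's loss.
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    # Step each pixel by epsilon in the direction of the loss gradient's sign.
    adversarial = image + epsilon * image.grad.sign()
    # Keep the result a valid image (assumed [0, 1] pixel range).
    return adversarial.clamp(0, 1).detach()
```

A detector hardened against such attacks would typically be evaluated on both clean frames and frames perturbed this way, e.g. `fgsm_perturb(detector, frame, label)` for each video frame (the `detector`, `frame`, and `label` names here are hypothetical).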


