Is Attentional Channel Processing Design Required? Comprehensive Analysis Of Robustness Between Vision Transformers And Fully Attentional Networks

06/08/2023
by Abhishri Ajit Medewar, et al.

Robustness testing has been performed for standard CNN models and Vision Transformers; however, there is no comprehensive study comparing the robustness of traditional Vision Transformers, which lack an extra attentional channel processing design, with that of the latest fully attentional network (FAN) models. In this paper, we use the ImageNet dataset to compare the robustness of FAN models and traditional Vision Transformers under white-box attacks, in order to understand the role of an attentional channel processing design, and we also study the transferability of adversarial examples between the two architectures using black-box attacks.
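The listing carries only the abstract, so no evaluation code is shown. The sketch below is a minimal PyTorch illustration of the kind of white-box PGD evaluation and black-box transferability check the abstract describes; the helper names (pgd_attack, transfer_attack), the L-inf budget and step settings, and the choice of PGD itself are assumptions for illustration, not details taken from the paper.

```python
# Minimal sketch (not the authors' code) of a white-box robustness evaluation
# and a black-box transferability check between two image classifiers.
# Assumptions: models are in eval() mode, inputs x are in [0, 1], labels y are
# class indices; eps/alpha/steps are illustrative defaults, not the paper's.
import torch
import torch.nn.functional as F

def pgd_attack(model, x, y, eps=4/255, alpha=1/255, steps=10):
    """L-inf PGD: perturb x within an eps-ball to maximize cross-entropy."""
    x_adv = x.clone().detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        # Ascend the loss, then project back into the eps-ball around x
        # and clamp to the valid pixel range.
        x_adv = x_adv.detach() + alpha * grad.sign()
        x_adv = x + (x_adv - x).clamp(-eps, eps)
        x_adv = x_adv.clamp(0.0, 1.0).detach()
    return x_adv

@torch.no_grad()
def accuracy(model, x, y):
    """Top-1 accuracy; applied to x_adv this gives robust accuracy."""
    return (model(x).argmax(dim=1) == y).float().mean().item()

def transfer_attack(surrogate, victim, x, y):
    """Black-box transferability: craft adversarial examples on the surrogate
    (e.g. a ViT), then measure the victim's (e.g. a FAN model's) accuracy."""
    x_adv = pgd_attack(surrogate, x, y)
    return accuracy(victim, x_adv)
```

Under this setup, the white-box comparison reduces to running pgd_attack against each model directly, while the transferability study runs transfer_attack in both directions (ViT surrogate to FAN victim, and vice versa) and compares the resulting accuracy drops.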

Related research

04/26/2022
Understanding The Robustness in Vision Transformers
Recent studies show that Vision Transformers (ViTs) exhibit strong robust...

09/07/2022
Securing the Spike: On the Transferability and Security of Spiking Neural Networks to Adversarial Examples
Spiking neural networks (SNNs) have attracted much attention for their h...

10/14/2022
Pretrained Transformers Do not Always Improve Robustness
Pretrained Transformers (PT) have been shown to improve Out of Distribut...

12/30/2021
Stochastic Layers in Vision Transformers
We introduce fully stochastic layers in vision transformers, without cau...

08/01/2022
Understanding Adversarial Robustness of Vision Transformers via Cauchy Problem
Recent research on the robustness of deep learning has shown that Vision...

10/24/2022
The Robustness Limits of SoTA Vision Models to Natural Variation
Recent state-of-the-art vision models introduced new architectures, lear...

08/30/2023
Emergence of Segmentation with Minimalistic White-Box Transformers
Transformer-like models for vision tasks have recently proven effective ...
