An Enhanced Convolutional Neural Network in Side-Channel Attacks and Its Visualization

09/18/2020
by Minhui Jin, et al.

In recent years, convolutional neural networks (CNNs) have received a lot of interest in the side-channel community. Previous work has shown that CNNs have the potential to break cryptographic implementations protected with masking or desynchronization, and several CNN models have already reached the same or even better performance than traditional side-channel attacks (SCAs). In this paper, we investigate the Residual Network architecture and build a new CNN model called the attention network. To enhance its power, we introduce an attention mechanism, the Convolutional Block Attention Module (CBAM), and incorporate it into the CNN architecture. CBAM highlights the informative points of the input traces and makes the attention network focus on the relevant leakages in the measurements, which improves the performance of the CNNs, since irrelevant points introduce extra noise and degrade the attacks. We compare our attention network with the ASCAD network, which was designed for a masked AES implementation, and show that the attention network performs better. Finally, a new visualization method, named Class Gradient Visualization (CGV), is proposed to identify which points of the input traces have a positive influence on the prediction of the neural network; from another angle, it also explains why the attention network is superior to the ASCAD network. We validate the attention network through extensive experiments on four public datasets and demonstrate that it is efficient on different AES implementations.
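To make the role of CBAM concrete, the following is a minimal sketch of a CBAM block adapted to one-dimensional side-channel traces, written in PyTorch. The layer sizes, reduction ratio, and kernel size are illustrative assumptions, not the exact configuration used in the paper.

```python
# Minimal CBAM sketch for 1D traces (illustrative, not the paper's exact setup).
import torch
import torch.nn as nn

class ChannelAttention1d(nn.Module):
    def __init__(self, channels, reduction=8):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
        )

    def forward(self, x):                       # x: (batch, channels, length)
        avg = self.mlp(x.mean(dim=2))           # average-pooled channel descriptor
        mx = self.mlp(x.amax(dim=2))            # max-pooled channel descriptor
        scale = torch.sigmoid(avg + mx).unsqueeze(2)
        return x * scale                        # re-weight channels

class SpatialAttention1d(nn.Module):
    def __init__(self, kernel_size=7):
        super().__init__()
        self.conv = nn.Conv1d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        # Channel-wise average and max maps, concatenated along the channel axis.
        attn = torch.cat([x.mean(dim=1, keepdim=True),
                          x.amax(dim=1, keepdim=True)], dim=1)
        scale = torch.sigmoid(self.conv(attn))  # one weight per time sample
        return x * scale                        # emphasize informative trace points

class CBAM1d(nn.Module):
    """Channel attention followed by spatial attention, as in CBAM."""
    def __init__(self, channels):
        super().__init__()
        self.channel = ChannelAttention1d(channels)
        self.spatial = SpatialAttention1d()

    def forward(self, x):
        return self.spatial(self.channel(x))

# Usage: insert CBAM1d(channels) after a convolutional block,
# e.g. inside a residual unit of the attention network.
```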
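Similarly, a rough sketch of a class-gradient style visualization in the spirit of CGV is given below, assuming a trained PyTorch model `net` that maps a trace of shape (1, 1, length) to class scores; the exact CGV formulation in the paper may differ from this simple saliency computation.

```python
# Hedged sketch of a class-gradient visualization; `net` and the tensor shapes
# are assumptions for illustration, not the paper's exact CGV definition.
import torch

def class_gradient(net, trace):
    """Gradient of the predicted class score with respect to each trace sample."""
    net.eval()
    x = trace.clone().detach().requires_grad_(True)   # shape (1, 1, length)
    scores = net(x)                                    # shape (1, num_classes)
    idx = scores.argmax(dim=1).item()                  # predicted class
    scores[0, idx].backward()                          # d(score) / d(input)
    grad = x.grad.detach().squeeze()                   # shape (length,)
    # Samples with large positive gradients push the output toward the predicted
    # class, i.e. they are the points the network relies on (candidate leakage).
    return grad.clamp(min=0)
```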
