Text Steganalysis with Attentional LSTM-CNN

12/30/2019
by YongJian Bao, et al.

With the rapid development of Natural Language Processing (NLP) technologies, text steganography methods have advanced significantly in recent years, posing a growing threat to cybersecurity. In this paper, we propose a novel attentional LSTM-CNN model to tackle the text steganalysis problem. The proposed method first maps words into a semantic space to better exploit the semantic features of texts, and then uses a combination of Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) recurrent neural networks to capture both local and long-distance contextual information in steganographic texts. In addition, we apply an attention mechanism to recognize and attend to important clues within suspicious sentences. After merging the feature clues from the CNN and RNN branches, we use a softmax layer to categorize the input text as cover or stego. Experiments show that our model achieves state-of-the-art results on the text steganalysis task.
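To make the described architecture concrete, the following is a minimal PyTorch-style sketch of an attentional LSTM-CNN classifier in the spirit of the abstract: word embedding, parallel CNN and bidirectional LSTM branches, additive attention over the LSTM states, feature merging, and a softmax output. It is not the authors' implementation; all hyperparameters (embedding size, filter widths, hidden size) and layer choices here are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionalLSTMCNN(nn.Module):
    """Sketch of an attentional LSTM-CNN text classifier (cover vs. stego).
    Hyperparameters are illustrative, not taken from the paper."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128,
                 num_filters=100, kernel_sizes=(3, 4, 5), num_classes=2):
        super().__init__()
        # Map word indices into a dense semantic space
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # CNN branch: capture local n-gram features
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_filters, k) for k in kernel_sizes)
        # LSTM branch: capture long-distance context
        self.lstm = nn.LSTM(embed_dim, hidden_dim,
                            batch_first=True, bidirectional=True)
        # Additive attention over LSTM states to weight important clues
        self.attn = nn.Linear(2 * hidden_dim, 1)
        # Merge CNN and attended-RNN features, then classify
        merged_dim = num_filters * len(kernel_sizes) + 2 * hidden_dim
        self.fc = nn.Linear(merged_dim, num_classes)

    def forward(self, x):                          # x: (batch, seq_len)
        e = self.embedding(x)                      # (batch, seq_len, embed_dim)

        # CNN branch with max-over-time pooling
        c = e.transpose(1, 2)                      # (batch, embed_dim, seq_len)
        cnn_feats = torch.cat(
            [F.relu(conv(c)).max(dim=2).values for conv in self.convs], dim=1)

        # LSTM branch with attention pooling over time steps
        h, _ = self.lstm(e)                        # (batch, seq_len, 2*hidden_dim)
        weights = F.softmax(self.attn(h), dim=1)   # (batch, seq_len, 1)
        rnn_feats = (weights * h).sum(dim=1)       # (batch, 2*hidden_dim)

        # Merge both feature sets and output cover/stego probabilities
        merged = torch.cat([cnn_feats, rnn_feats], dim=1)
        return F.log_softmax(self.fc(merged), dim=1)
```

In this sketch, the CNN filters of widths 3-5 play the role of local context detectors, while the attention-weighted bidirectional LSTM summary captures longer-range dependencies; concatenating the two before the final linear layer corresponds to the feature-merging step described in the abstract.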


