Training Transformers for Information Security Tasks: A Case Study on Malicious URL Prediction

11/05/2020
by Ethan M. Rudd, et al.

Machine Learning (ML) for information security (InfoSec) operates on distinct data types and formats, which require different treatment during optimization/training on raw data. In this paper, we implement a malicious/benign URL predictor based on a transformer architecture trained from scratch. We show that, in contrast to conventional natural language processing (NLP) transformers, this model requires a different training approach to work well. Specifically, we show that 1) pre-training on a massive corpus of unlabeled URL data for an auto-regressive task does not readily transfer to malicious/benign prediction, but 2) using an auxiliary auto-regressive loss improves performance when training from scratch. We introduce a method for mixed objective optimization, which dynamically balances the contributions from both loss terms so that neither dominates. We show that this method yields performance comparable to that of several top-performing benchmark classifiers.
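To make the mixed objective concrete, below is a minimal PyTorch-style sketch of one way to combine a malicious/benign classification loss with an auxiliary auto-regressive (next-character) loss and dynamically balance them so that neither term dominates. The function name, tensor shapes, and the inverse-magnitude weighting scheme are illustrative assumptions, not necessarily the authors' exact formulation.

    import torch
    import torch.nn.functional as F

    def mixed_objective_loss(cls_logits, labels, lm_logits, next_tokens):
        """Balance a malicious/benign classification loss with an auxiliary
        auto-regressive (next-character) loss over URL text.

        cls_logits:  (batch,) raw scores for the malicious class
        labels:      (batch,) 0/1 benign/malicious labels
        lm_logits:   (batch, seq_len, vocab) next-character predictions
        next_tokens: (batch, seq_len) shifted URL character targets
        """
        # Binary cross-entropy for the malicious/benign head.
        cls_loss = F.binary_cross_entropy_with_logits(cls_logits, labels.float())

        # Auxiliary auto-regressive loss over URL characters.
        ar_loss = F.cross_entropy(
            lm_logits.reshape(-1, lm_logits.size(-1)),
            next_tokens.reshape(-1),
        )

        # Dynamic balancing (assumed heuristic): scale each term by the inverse
        # of its current, detached magnitude so both contribute comparably.
        w_cls = 1.0 / (cls_loss.detach() + 1e-8)
        w_ar = 1.0 / (ar_loss.detach() + 1e-8)
        return 0.5 * (w_cls * cls_loss + w_ar * ar_loss)

    # Example usage with random tensors (batch=8, seq_len=64, vocab=128):
    loss = mixed_objective_loss(
        torch.randn(8),
        torch.randint(0, 2, (8,)),
        torch.randn(8, 64, 128),
        torch.randint(0, 128, (8, 64)),
    )

Weighting each term by the inverse of its detached magnitude is one common balancing heuristic; an exponential moving average of the loss magnitudes would serve the same purpose while smoothing the weights across training steps.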


