LSTM Hyper-Parameter Selection for Malware Detection: Interaction Effects and Hierarchical Selection Approach

09/23/2021
by   Mohit Sewak, et al.

Long Short-Term Memory (LSTM) networks have shown great promise in artificial intelligence (AI) based language modeling, and they have recently become popular for designing AI-based Intrusion Detection Systems (IDS). However, their applicability to IDS has been studied largely under the default settings used in language models, even though security applications present distinct conditions and therefore warrant careful consideration when such recurrent networks are applied. We therefore conducted one of the most exhaustive studies of LSTM hyper-parameters for IDS, experimenting with approximately 150 LSTM configurations to determine the hyper-parameters' relative importance, their interaction effects, and an optimal selection approach for designing an IDS. We performed multiple analyses of the results of these experiments and empirically controlled for the interaction effects between the levels of the different hyper-parameter covariates. We found that for security applications, and especially for designing an IDS, neither the relative importance established for language models holds, nor is the standard linear method of hyper-parameter selection ideal. We ascertained that interaction effects play a crucial role in determining the relative importance of hyper-parameters, and that after controlling for them, the correct order of relative importance for an LSTM-based IDS is batch size, followed by dropout ratio and padding. These findings are significant because when LSTMs were first used for language models, the focus had mostly been on increasing the number of layers to enhance performance.
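The workflow the abstract describes can be sketched in a few lines: enumerate a full grid of configurations, then tune hyper-parameters one at a time in order of relative importance rather than jointly. The grid levels and the scoring function below are purely illustrative assumptions (the paper's actual values and the LSTM training/evaluation step are not given here); the grid is merely sized so that 5 × 5 × 6 = 150 configurations, matching the "approx. 150" experiments mentioned above.

```python
from itertools import product

# Hypothetical hyper-parameter grid (levels are illustrative, not the paper's
# actual values); 5 x 5 x 6 = 150 configurations.
grid = {
    "batch_size": [32, 64, 128, 256, 512],
    "dropout":    [0.0, 0.2, 0.4, 0.6, 0.8],
    "padding":    [100, 250, 500, 750, 1000, 1500],
}
configs = [dict(zip(grid, vals)) for vals in product(*grid.values())]
print(len(configs))  # 150

def hierarchical_select(score, grid, order):
    """Greedy, hierarchical selection: fix the most important hyper-parameter
    first, then tune each subsequent one conditional on the earlier choices."""
    chosen = {name: levels[0] for name, levels in grid.items()}
    for name in order:
        chosen[name] = max(grid[name], key=lambda v: score({**chosen, name: v}))
    return chosen

# Stand-in scoring function (a real study would train and evaluate an LSTM
# IDS here); its optimum is placed at an arbitrary point in the grid.
def toy_score(cfg):
    return -(abs(cfg["batch_size"] - 128)
             + 100 * abs(cfg["dropout"] - 0.4)
             + abs(cfg["padding"] - 500))

# Tune in the order of relative importance reported in the paper:
best = hierarchical_select(toy_score, grid,
                           order=("batch_size", "dropout", "padding"))
print(best)  # {'batch_size': 128, 'dropout': 0.4, 'padding': 500}
```

Note that this greedy, one-at-a-time sweep is exactly the kind of "linear" selection the paper argues can mislead when interaction effects are strong: with the separable toy score above it finds the global optimum, but if the best dropout depended on the chosen batch size, the sweep order would matter.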

