Automatic Machine Learning for Multi-Receiver CNN Technology Classifiers

Convolutional Neural Networks (CNNs) are among the most studied families of deep learning models for signal classification, including modulation and technology classification, detection, and identification. In this work, we focus on technology classification based on raw I/Q samples collected from multiple synchronized receivers. As an example use case, we study protocol identification of Wi-Fi, LTE-LAA, and 5G NR-U technologies that coexist over the 5 GHz Unlicensed National Information Infrastructure (U-NII) bands. Designing and training accurate CNN classifiers involves significant time and effort spent fine-tuning a model's architectural settings and determining the appropriate hyperparameter configurations, such as learning rate and batch size. We tackle the former by treating the architectural settings themselves as hyperparameters. We attempt to automatically optimize these architectural parameters, along with other preprocessing hyperparameters (e.g., the number of I/Q samples within each classifier input) and learning hyperparameters, by formulating a Hyperparameter Optimization (HyperOpt) problem, which we solve in a near-optimal fashion using the Hyperband algorithm. The resulting near-optimal CNN (OCNN) classifier is then used to study classification accuracy for over-the-air (OTA) as well as simulated datasets, considering various SNR values. We show that the number of receivers used to construct multi-channel inputs for CNNs should itself be treated as a preprocessing hyperparameter to be optimized via Hyperband. OTA results reveal that our OCNN classifiers improve classification accuracy by 24.58% compared to manually tuned CNNs. We also study the effect of min-max normalization of the I/Q samples within each classifier input on generalization accuracy over simulated datasets with SNRs other than the training set's SNR, and show an average improvement of 108.05% when I/Q samples are normalized.
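The paper's Hyperband-based search over architectural, preprocessing, and learning hyperparameters is not reproduced here; as an illustration of the underlying algorithm, the following is a toy pure-Python sketch of Hyperband's bracket and successive-halving structure on a surrogate objective (all function names, the quadratic toy loss, and the budget values are illustrative assumptions, not the paper's code):

```python
import math
import random

def hyperband(objective, sample_config, max_resource=81, eta=3):
    """Toy Hyperband: explore many configs at small budgets, then
    repeatedly keep the top 1/eta at ever larger budgets.

    objective(config, budget) -> loss (lower is better)
    sample_config()           -> one random hyperparameter configuration
    Returns the best (config, loss) pair seen across all brackets.
    """
    s_max = int(math.log(max_resource) / math.log(eta) + 1e-9)
    best_config, best_loss = None, float("inf")
    for s in range(s_max, -1, -1):                        # one bracket per s
        n = math.ceil((s_max + 1) * eta**s / (s + 1))     # initial configs
        r = max_resource * eta**-s                        # initial budget
        configs = [sample_config() for _ in range(n)]
        for i in range(s + 1):                            # successive halving
            budget = int(r * eta**i)
            losses = [objective(c, budget) for c in configs]
            ranked = sorted(zip(losses, configs), key=lambda t: t[0])
            if ranked[0][0] < best_loss:
                best_loss, best_config = ranked[0]
            keep = max(1, len(configs) // eta)            # keep top 1/eta
            configs = [c for _, c in ranked[:keep]]
    return best_config, best_loss

# Toy usage: search for a "learning rate" minimizing a quadratic surrogate.
random.seed(0)
toy_loss = lambda cfg, budget: (cfg["lr"] - 0.3) ** 2
cfg, loss = hyperband(toy_loss, lambda: {"lr": random.uniform(0.0, 1.0)})
```

In practice this loop is provided by libraries such as KerasTuner's Hyperband tuner, where the `objective` corresponds to training a candidate CNN for `budget` epochs and reporting its validation loss.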

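The abstract reports that min-max normalizing the I/Q samples within each classifier input improves generalization to SNRs not seen in training. A minimal NumPy sketch of that preprocessing step follows; the input layout (samples × stacked I/Q channels), shapes, and function name are assumptions for illustration, not the paper's exact format:

```python
import numpy as np

def minmax_normalize(iq, eps=1e-12):
    """Scale one classifier input's I/Q samples into [0, 1].

    `iq` is one input of shape (num_samples, channels), where the
    channel axis stacks the I and Q streams of each synchronized
    receiver (an assumed layout). The min and max are taken over the
    whole input, so relative amplitudes across channels are preserved.
    """
    lo, hi = iq.min(), iq.max()
    return (iq - lo) / (hi - lo + eps)

# Normalize a batch one input at a time, so each classifier input is
# scaled independently of the rest of the batch.
rng = np.random.default_rng(0)
batch = rng.standard_normal((4, 1024, 2))   # 4 inputs, 2 I/Q channels
normed = np.stack([minmax_normalize(x) for x in batch])
```

Per-input scaling like this removes absolute power differences between inputs (e.g., from different SNRs) while keeping each input's waveform shape intact.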
