Outage Performance and Novel Loss Function for an ML-Assisted Resource Allocation: An Exact Analytical Framework

05/16/2023
by Nidhi Simmons, et al.

Machine Learning (ML) is expected to be pivotal in enabling 6G and beyond communications. This paper applies ML to the outage probability issues commonly encountered in these systems. In particular, we consider a single-user, multi-resource greedy allocation strategy, in which an ML binary classification predictor assists in selecting an adequate resource. With no access to future channel state information, the predictor forecasts each resource's likely future outage status. When it encounters a resource it believes will be satisfactory, it allocates that resource to the user. Critically, the predictor's goal is to steer the user away from unsatisfactory resources, since these are likely to cause an outage. Our main result establishes exact and asymptotic expressions for this system's outage probability. Using these, we formulate a theoretically optimal, differentiable loss function with which to train our predictor. We then compare predictors trained with this loss function against those trained with traditional losses, namely binary cross-entropy (BCE), mean squared error (MSE), and mean absolute error (MAE). Predictors trained with our novel loss function achieve superior outage probability in all scenarios, sometimes outperforming those trained with BCE, MAE, and MSE by multiple orders of magnitude.
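The greedy allocation strategy described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function names, the SNR-threshold toy predictor, and the fallback rule for the all-unsatisfactory case are assumptions made for clarity.

```python
# Hypothetical sketch of the single-user, multi-resource greedy
# allocation strategy: scan resources in order and allocate the first
# one the binary classifier predicts will NOT be in outage.

def greedy_allocate(resources, predict_outage):
    """Return the index of the first resource predicted satisfactory.

    If every resource is predicted unsatisfactory, fall back to the
    last one scanned (an assumed tie-breaking rule, not from the paper).
    """
    for idx, features in enumerate(resources):
        if not predict_outage(features):  # predicted satisfactory
            return idx
    return len(resources) - 1  # all predicted to fail: take the last


def toy_predictor(features, snr_threshold=5.0):
    """Toy stand-in for the ML predictor: flag a resource as an outage
    risk when its SNR feature falls below a threshold (illustrative only;
    the paper trains a binary classifier for this decision)."""
    return features["snr"] < snr_threshold


resources = [{"snr": 3.1}, {"snr": 4.2}, {"snr": 7.8}, {"snr": 6.0}]
chosen = greedy_allocate(resources, toy_predictor)
print(chosen)  # -> 2, the first resource above the SNR threshold
```

In the paper, the quality of `predict_outage` is what the loss function shapes during training: a predictor that rarely admits an unsatisfactory resource keeps the allocated resource out of outage, which is exactly what the proposed loss optimizes for.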


Related research

10/31/2022  Xtreme Margin: A Tunable Loss Function for Binary Classification Problems
08/28/2017  On denoising autoencoders trained to minimise binary cross-entropy
10/25/2022  Bit Error and Block Error Rate Training for ML-Assisted Communication
09/21/2020  Massive MIMO Channel Prediction: Kalman Filtering vs. Machine Learning
12/17/2018  Joint Rate and Resource Allocation in Hybrid Digital-Analog Transmission over Fading Channels
09/15/2020  Competing AI: How does competition feedback affect machine learning?
09/21/2020  Copula-Based Bounds for Multi-User Communications – Part II: Outage Performance
