Hybrid-Layers Neural Network Architectures for Modeling the Self-Interference in Full-Duplex Systems

10/18/2021
by Mohamed Elsayed, et al.

Full-duplex (FD) systems have been introduced to provide high data rates for beyond fifth-generation wireless networks through simultaneous transmission of information over the same frequency resources. However, the operation of FD systems is practically limited by self-interference (SI), and efficient SI cancelers are sought to make FD systems realizable. Typically, polynomial-based cancelers are employed to mitigate the SI; nevertheless, they suffer from high complexity. This article proposes two novel hybrid-layers neural network (NN) architectures to cancel the SI with low complexity. The first architecture is referred to as the hybrid-convolutional recurrent NN (HCRNN), whereas the second is termed the hybrid-convolutional recurrent dense NN (HCRDNN). In contrast to the state-of-the-art NNs that employ only dense or recurrent layers for SI modeling, the proposed NNs exploit, in a novel manner, a combination of different hidden layers (e.g., convolutional, recurrent, and/or dense) in order to model the SI with lower computational complexity than both the polynomial and the state-of-the-art NN-based cancelers. The key idea behind using hybrid layers is to build an NN model that exploits the distinct characteristics of each layer type in its architecture. More specifically, in the HCRNN, a convolutional layer extracts features from the input data at a reduced network scale, and a recurrent layer then learns the temporal behavior of the input signal from the localized feature map of the convolutional layer. In the HCRDNN, an additional dense layer provides another degree of freedom for adapting the NN settings, allowing the best compromise between cancellation performance and computational complexity. Complexity analysis and numerical simulations are provided to demonstrate the superiority of the proposed architectures.
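To make the layer composition concrete, below is a minimal PyTorch sketch of the two hybrid architectures described in the abstract. The class names HCRNN and HCRDNN follow the paper, but the framework choice, input window length, number of filters, kernel size, and hidden-unit counts are illustrative assumptions and not values taken from the paper; the actual canceler would be trained on transmitted baseband samples to regress the received SI.

```python
import torch
import torch.nn as nn

class HCRNN(nn.Module):
    """Hybrid convolutional-recurrent SI canceler (illustrative sketch).

    Input:  (batch, window, 2) real/imag parts of the transmitted samples.
    Output: (batch, 2) real/imag parts of the estimated SI sample.
    All layer sizes are placeholder assumptions, not the paper's settings.
    """
    def __init__(self, conv_filters=8, kernel_size=3, rnn_units=6):
        super().__init__()
        # Conv1d expects (batch, channels, length); 2 channels carry I/Q.
        self.conv = nn.Conv1d(2, conv_filters, kernel_size)
        self.act = nn.ReLU()
        # The GRU consumes the localized feature map produced by the conv layer.
        self.rnn = nn.GRU(conv_filters, rnn_units, batch_first=True)
        self.out = nn.Linear(rnn_units, 2)  # real and imaginary SI estimate

    def forward(self, x):                               # x: (batch, window, 2)
        h = self.act(self.conv(x.transpose(1, 2)))      # (batch, filters, L)
        h, _ = self.rnn(h.transpose(1, 2))              # (batch, L, rnn_units)
        return self.out(h[:, -1, :])                    # last step -> SI estimate

class HCRDNN(HCRNN):
    """Adds a dense hidden layer between the recurrent layer and the output."""
    def __init__(self, conv_filters=8, kernel_size=3, rnn_units=6, dense_units=8):
        super().__init__(conv_filters, kernel_size, rnn_units)
        self.out = nn.Sequential(
            nn.Linear(rnn_units, dense_units), nn.ReLU(),
            nn.Linear(dense_units, 2),
        )

# Quick shape check with random data standing in for transmitted samples.
x = torch.randn(32, 13, 2)
print(HCRNN()(x).shape, HCRDNN()(x).shape)  # torch.Size([32, 2]) twice
```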
