Achieving Robust Generalization for Wireless Channel Estimation Neural Networks by Designed Training Data

02/05/2023
by Dianxin Luan, et al.

In this paper, we propose a method for designing training data that supports robust generalization of trained neural networks to unseen channels. The proposed design, and how it improves generalization, are described and analysed. It avoids the need for online training on previously unseen channels, which is a memory- and processing-intensive solution, especially for battery-powered mobile terminals. To validate the proposed method, we simulate channels modelled according to different standards and fading models. We also use an attention-based structure and a convolutional neural network to evaluate the generalization achieved. Simulation results show that the trained neural networks maintain almost identical performance on the unseen channels.
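
The abstract does not spell out the training-data design, but the general idea of randomizing channel parameters during data generation so that an estimator generalizes to unseen channel models can be sketched as follows. This is a minimal, hypothetical illustration, not the authors' released code: the subcarrier count, tap statistics, SNR range, and the small CNN refining a least-squares estimate are all illustrative assumptions.

```python
# Hypothetical sketch: randomized multipath training data for a channel-estimation CNN.
# All parameter choices (64 subcarriers, tap counts, SNR range) are assumptions.
import numpy as np
import torch
import torch.nn as nn

N_SC = 64          # number of pilot subcarriers (assumed)
MAX_TAPS = 8       # maximum number of multipath taps (assumed)

def random_channel_freq_response(n_sc=N_SC, max_taps=MAX_TAPS):
    """Draw one random tapped-delay-line channel and return its frequency response."""
    n_taps = np.random.randint(1, max_taps + 1)
    delays = np.random.choice(n_sc // 4, size=n_taps, replace=False)
    # Exponentially decaying, randomly perturbed power-delay profile
    powers = np.exp(-delays / np.random.uniform(1.0, 8.0))
    powers /= powers.sum()
    gains = np.sqrt(powers / 2) * (np.random.randn(n_taps) + 1j * np.random.randn(n_taps))
    k = np.arange(n_sc)
    # H[k] = sum_l g_l * exp(-j 2*pi*k*d_l / N)
    return (gains[None, :] * np.exp(-2j * np.pi * np.outer(k, delays) / n_sc)).sum(axis=1)

def make_batch(batch_size=256, snr_db_range=(0.0, 25.0)):
    """Noisy least-squares estimates (input) and true responses (target), as real/imag channels."""
    H = np.stack([random_channel_freq_response() for _ in range(batch_size)])
    snr_db = np.random.uniform(*snr_db_range, size=(batch_size, 1))
    noise_std = np.sqrt(10 ** (-snr_db / 10) / 2)
    H_ls = H + noise_std * (np.random.randn(*H.shape) + 1j * np.random.randn(*H.shape))
    to_t = lambda x: torch.tensor(np.stack([x.real, x.imag], axis=1), dtype=torch.float32)
    return to_t(H_ls), to_t(H)

class EstimatorCNN(nn.Module):
    """Small 1-D CNN that refines the least-squares channel estimate."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(2, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(32, 2, kernel_size=5, padding=2),
        )

    def forward(self, x):
        return self.net(x)

model = EstimatorCNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(200):  # short demonstration training loop
    x, y = make_batch()
    loss = nn.functional.mse_loss(model(x), y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```

Because every training sample is drawn from a freshly randomized power-delay profile rather than a single fixed channel model, the network is encouraged to learn estimation behaviour that transfers to channel realizations it has never seen, which is the spirit of the generalization argument in the abstract; the trained model would then be evaluated on standard-defined channels held out from training.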
