On Recurrent Neural Networks for Sequence-based Processing in Communications

05/24/2019
by Daniel Tandler, et al.

In this work, we analyze the capabilities and practical limitations of neural networks (NNs) for sequence-based signal processing, a task that is omnipresent in virtually all modern communication systems. In particular, we train multiple state-of-the-art recurrent neural network (RNN) structures to decode convolutional codes, which allows a clear benchmark against the corresponding maximum likelihood (ML) Viterbi decoder. We examine the decoding performance of various NN architectures, from classical types such as feedforward layers and gated recurrent unit (GRU) layers up to more recently introduced architectures such as temporal convolutional networks (TCNs) and differentiable neural computers (DNCs) with external memory. A key limitation turns out to be that the training complexity increases exponentially with the encoder memory ν, which practically limits the achievable bit error rate (BER) performance. To overcome this limitation, we introduce a new training method that gradually increases the number of ones within the training sequences, i.e., we constrain the set of possible training sequences at the beginning until first convergence. By consecutively adding more and more possible sequences to the training set, we finally achieve training success in cases where naive training did not converge. Further, we show that our network can learn to jointly detect and decode a quadrature phase shift keying (QPSK) modulated code with sub-optimal (anti-Gray) labeling in one shot, at a performance that would require iterations between demapper and decoder in classic detection schemes.

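The curriculum idea can be made concrete with a minimal sketch: draw information sequences whose Hamming weight is capped, widen the cap stage by stage, and train an RNN decoder on noisy observations of the encoded bits. Everything below is an illustrative assumption, not the authors' implementation: the rate-1/2, memory ν = 2 convolutional code with generators (7,5) in octal, the BPSK mapping over AWGN (the paper's joint-detection experiment uses QPSK; BPSK just keeps the sketch short), the bidirectional GRU decoder, and the weight schedule and step counts.

```python
import numpy as np
import torch
import torch.nn as nn

def conv_encode(bits, g=((1, 1, 1), (1, 0, 1))):
    """Rate-1/2 convolutional encoder, memory nu = 2.
    Generators (7,5) octal are an assumption for illustration."""
    state = [0.0, 0.0]
    out = []
    for b in bits:
        reg = [b] + state                       # shift register: newest bit first
        for gen in g:
            out.append(sum(gi * ri for gi, ri in zip(gen, reg)) % 2)
        state = reg[:-1]
    return np.array(out, dtype=np.float32)

def sample_constrained_batch(batch_size, seq_len, max_ones, snr_db=2.0):
    """Draw info sequences with Hamming weight <= max_ones,
    encode them, and observe BPSK symbols over AWGN."""
    x = np.zeros((batch_size, seq_len), dtype=np.float32)
    for i in range(batch_size):
        k = np.random.randint(0, max_ones + 1)
        pos = np.random.choice(seq_len, size=k, replace=False)
        x[i, pos] = 1.0
    y = np.stack([conv_encode(row) for row in x])   # coded bits, 2 per info bit
    s = 1.0 - 2.0 * y                               # BPSK: 0 -> +1, 1 -> -1
    sigma = np.sqrt(0.5 * 10 ** (-snr_db / 10.0))   # noise std for unit Es
    r = s + sigma * np.random.randn(*s.shape).astype(np.float32)
    return r, x

class GRUDecoder(nn.Module):
    """Bidirectional GRU mapping the two coded symbols per info bit to a logit."""
    def __init__(self, hidden=64):
        super().__init__()
        self.rnn = nn.GRU(input_size=2, hidden_size=hidden,
                          num_layers=2, batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, 1)

    def forward(self, r):                           # r: (batch, 2 * seq_len)
        z = r.view(r.shape[0], -1, 2)               # group symbols per info bit
        h, _ = self.rnn(z)
        return self.out(h).squeeze(-1)              # logits: (batch, seq_len)

model = GRUDecoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()

# Curriculum: begin with at most one '1' per sequence, then widen the
# training set stage by stage (this schedule is purely hypothetical).
for max_ones in (1, 2, 4, 8, 16, 32, 64):
    for _ in range(200):                            # steps per stage (assumed)
        r, x = sample_constrained_batch(128, 64, max_ones)
        logits = model(torch.from_numpy(r))
        loss = loss_fn(logits, torch.from_numpy(x))
        opt.zero_grad()
        loss.backward()
        opt.step()
```

In practice one would advance to the next weight cap only once the loss plateaus (the fixed step count above stands in for that convergence check), which matches the paper's description of constraining the training set until first convergence and then consecutively enlarging it.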