Efficient Training of Volterra Series-Based Pre-distortion Filter Using Neural Networks

12/13/2021
by Vinod Bajaj, et al.

We present a simple and efficient "direct learning" approach for training Volterra series-based digital pre-distortion filters using neural networks. We demonstrate its superior performance over conventional training methods on a simulated 64-GBaud 64-QAM transmitter under varying transmitter nonlinearity and noise conditions.
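The "direct learning" idea named in the abstract, fitting the pre-distorter by backpropagating through a cascade of the pre-distorter and a differentiable model of the nonlinear transmitter so its output matches the ideal signal, can be illustrated with a short sketch. This is only a minimal illustration, not the authors' method or code: the memory-polynomial Volterra structure, the toy cubic amplifier model, the random training signal, and all identifiers (volterra_dpd, pa_model, cr, ci) are assumptions made for the example.

```python
import torch

MEMORY, ORDER = 3, 5                 # memory taps and maximum (odd) nonlinearity order
K = (ORDER + 1) // 2                 # number of odd-order kernels: 1st, 3rd, 5th

def volterra_dpd(x, coeffs):
    """Memory polynomial: y[n] = sum_q sum_k c[q, k] * x[n-q] * |x[n-q]|^(2k)."""
    xpad = torch.cat([torch.zeros(MEMORY - 1, dtype=x.dtype), x])
    frames = xpad.unfold(0, MEMORY, 1).flip(1)             # frames[n, q] = x[n-q]
    basis = torch.stack([frames * frames.abs() ** (2 * k)  # odd-order terms only
                         for k in range(K)], dim=-1)       # shape [N, MEMORY, K]
    return (basis * coeffs).sum(dim=(1, 2))

def pa_model(u):
    """Toy memoryless power-amplifier nonlinearity (cubic compression), an assumption."""
    return u - 0.2 * u * u.abs() ** 2

# Complex baseband training signal (a stand-in for 64-QAM 64-GBaud symbols).
x = (torch.randn(8192) + 1j * torch.randn(8192)) / 2 ** 0.5

# Real/imaginary parts of the Volterra coefficients, initialised to an identity filter.
cr = torch.zeros(MEMORY, K)
cr[0, 0] = 1.0
ci = torch.zeros(MEMORY, K)
cr.requires_grad_(True)
ci.requires_grad_(True)

opt = torch.optim.Adam([cr, ci], lr=1e-2)
for step in range(500):
    opt.zero_grad()
    y = pa_model(volterra_dpd(x, torch.complex(cr, ci)))   # DPD -> PA cascade
    err = y - x                                            # direct learning: match the ideal signal
    loss = (err.real ** 2 + err.imag ** 2).mean()
    loss.backward()
    opt.step()
```

For contrast, a conventional indirect-learning setup would first fit a post-inverse of the amplifier output and then copy it in front of the amplifier; the sketch above instead optimizes the pre-distorter in its actual operating position, which is the appeal of the direct-learning formulation.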
