Knowledge Distillation Applied to Optical Channel Equalization: Solving the Parallelization Problem of Recurrent Connection

12/08/2022
by Sasipim Srivallapanondh, et al.

To circumvent the non-parallelizability of recurrent neural network-based equalizers, we propose knowledge distillation to recast the RNN into a parallelizable feedforward structure. The distilled feedforward equalizer reduces latency by 38% while degrading the Q-factor by only 0.5 dB.
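As a rough illustration of the idea in the abstract, the sketch below shows how knowledge distillation from a recurrent teacher equalizer into a feedforward student could be set up in PyTorch. The teacher/student architectures, the 41-tap symbol window, the I/Q input representation, the MSE-based soft-target loss, and the alpha weighting are all illustrative assumptions, not the authors' actual configuration.

```python
import torch
import torch.nn as nn

class RNNTeacher(nn.Module):
    """Recurrent equalizer (teacher): maps a window of received symbols to the equalized centre symbol."""
    def __init__(self, n_taps=41, hidden=64):
        super().__init__()
        self.rnn = nn.LSTM(input_size=2, hidden_size=hidden,
                           batch_first=True, bidirectional=True)
        self.out = nn.Linear(2 * hidden, 2)  # I/Q of the centre symbol

    def forward(self, x):                    # x: (batch, n_taps, 2)
        h, _ = self.rnn(x)
        return self.out(h[:, x.shape[1] // 2, :])

class FeedforwardStudent(nn.Module):
    """Parallelizable feedforward student trained to mimic the recurrent teacher."""
    def __init__(self, n_taps=41, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),
            nn.Linear(2 * n_taps, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 2),
        )

    def forward(self, x):                    # x: (batch, n_taps, 2)
        return self.net(x)

def distill_step(teacher, student, optimizer, x, y, alpha=0.5):
    """One KD step: blend the hard loss (transmitted symbols y) with the soft loss (teacher outputs)."""
    with torch.no_grad():
        soft_targets = teacher(x)            # teacher is frozen during distillation
    preds = student(x)
    loss = alpha * nn.functional.mse_loss(preds, y) \
         + (1 - alpha) * nn.functional.mse_loss(preds, soft_targets)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Example usage with random tensors standing in for received/transmitted symbol windows:
# teacher, student = RNNTeacher(), FeedforwardStudent()
# opt = torch.optim.Adam(student.parameters(), lr=1e-3)
# x, y = torch.randn(256, 41, 2), torch.randn(256, 2)
# distill_step(teacher, student, opt, x, y)
```

Because the student has no recurrent connections, each symbol window can be processed independently, which is what allows the parallel, lower-latency implementation the abstract refers to.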


Related research

- On Compressing U-net Using Knowledge Distillation (12/01/2018): We study the use of knowledge distillation to compress the U-net archite...
- Knowledge Distillation For Recurrent Neural Network Language Modeling With Trust Regularization (04/08/2019): Recurrent Neural Networks (RNNs) have dominated language modeling becaus...
- Fast Video Salient Object Detection via Spatiotemporal Knowledge Distillation (10/20/2020): Since the wide employment of deep learning frameworks in video salient o...
- Robust Knowledge Distillation from RNN-T Models With Noisy Training Labels Using Full-Sum Loss (03/10/2023): This work studies knowledge distillation (KD) and addresses its constrai...
- Exploring the Connection between Knowledge Distillation and Logits Matching (09/14/2021): Knowledge distillation is a generalized logits matching technique for mo...
- Low-resource Low-footprint Wake-word Detection using Knowledge Distillation (07/06/2022): As virtual assistants have become more diverse and specialized, so has t...
- Towards FPGA Implementation of Neural Network-Based Nonlinearity Mitigation Equalizers in Coherent Optical Transmission Systems (06/24/2022): For the first time, recurrent and feedforward neural network-based equal...
