A Survey of Applied Machine Learning Techniques for Optical OFDM based Networks

05/07/2021
by Hichem Mrabet, et al.

In this survey, we analyze the latest machine learning (ML) techniques for optical orthogonal frequency division multiplexing (O-OFDM)-based optical communications. ML has been proposed to mitigate channel and transceiver imperfections. For instance, ML can improve the signal quality under a low modulation extinction ratio, and it can tackle both deterministic and stochastically induced nonlinearities, such as parametric noise amplification in long-haul transmission. The proposed ML algorithms for O-OFDM can in particular tackle inter-subcarrier nonlinear effects such as four-wave mixing and cross-phase modulation. In essence, these ML techniques could benefit any multi-carrier approach (e.g., filter bank modulation). Supervised and unsupervised ML techniques are analyzed in terms of both O-OFDM transmission performance and computational complexity for potential real-time implementation. We indicate the conditions under which an ML algorithm should perform classification, regression, or clustering. The survey also discusses open research issues and future directions towards ML implementation.
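To make the clustering use case concrete, the sketch below (not taken from the survey; the QPSK format, the constant phase rotation used as a crude stand-in for fiber-induced nonlinear phase noise, the noise level, and the use of scikit-learn's KMeans are all illustrative assumptions) shows how an unsupervised clustering demapper can recover a rotated per-subcarrier constellation that a fixed-grid decision would misclassify.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Toy per-subcarrier QPSK constellation as observed after the receiver FFT of
# an O-OFDM link: a constant nonlinear phase rotation (a crude stand-in for
# fiber-induced phase noise) plus AWGN. All parameter values are assumptions.
n_points = 4000
tx_sym = rng.integers(0, 4, n_points)                  # symbol indices 0..3
tx = np.exp(1j * (np.pi / 4 + np.pi / 2 * tx_sym))     # unit-energy QPSK grid
theta = 0.6                                            # assumed rotation (rad)
noise = 0.15 * (rng.normal(size=n_points) + 1j * rng.normal(size=n_points))
rx = tx * np.exp(1j * theta) + noise

# Baseline: fixed-grid demapper that decides on the nearest ideal QPSK point,
# ignoring the rotation entirely.
grid = np.exp(1j * (np.pi / 4 + np.pi / 2 * np.arange(4)))
fixed_dec = np.argmin(np.abs(rx[:, None] - grid[None, :]), axis=1)

# Unsupervised alternative: k-means locates the four rotated clusters blindly;
# each cluster is then labelled by majority vote against the known symbols
# (in practice a short pilot sequence would provide these labels).
km = KMeans(n_clusters=4, n_init=10, random_state=0)
labels = km.fit_predict(np.column_stack([rx.real, rx.imag]))
cluster_to_sym = {c: np.bincount(tx_sym[labels == c]).argmax() for c in range(4)}
km_dec = np.array([cluster_to_sym[c] for c in labels])

print("fixed-grid SER:", np.mean(fixed_dec != tx_sym))
print("k-means SER:  ", np.mean(km_dec != tx_sym))
```

Under these assumed parameters, the fixed-grid demapper misclassifies a noticeable fraction of symbols because the rotated constellation straddles its decision boundaries, while k-means tracks the rotated centroids and demaps nearly error-free; this blind adaptation to distortion is the kind of trade-off between clustering, classification, and regression that the survey examines.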
