Semantic Communication with Adaptive Universal Transformer

08/20/2021
by   Qingyang Zhou, et al.

With the development of deep learning (DL), natural language processing (NLP) makes it possible to analyze and understand large amounts of text. Accordingly, semantic communication, in the form of joint semantic source and channel coding over a noisy channel, can be achieved with the help of NLP. However, existing methods pursue this goal with a fixed Transformer, ignoring the fact that different sentences carry different amounts of semantic information. To solve this problem, we propose a new semantic communication system based on the Universal Transformer. Compared with the traditional Transformer, the Universal Transformer introduces an adaptive recurrence (circulation) mechanism. Through this mechanism, the new semantic communication system can transmit sentences with different semantic content more flexibly and achieve better end-to-end performance under various channel conditions.
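The adaptive recurrence mechanism the abstract refers to is, in the Universal Transformer, an Adaptive Computation Time (ACT) loop: every position applies one shared transformer block repeatedly and halts on its own schedule once its accumulated halting probability is high enough, so "easy" tokens stop early while semantically richer ones take more steps. The following is a minimal NumPy sketch of that halting loop only, not the authors' system; the `transition` and `halt_unit` callables stand in for the shared transformer block and the learned halting unit, and their toy implementations below are illustrative assumptions.

```python
import numpy as np

def act_loop(x, transition, halt_unit, max_steps=8, eps=0.01):
    """ACT-style adaptive recurrence (sketch): each position applies the
    shared `transition` until its cumulative halting probability reaches
    1 - eps, or until `max_steps` iterations have been used."""
    seq_len, _ = x.shape
    halting_prob = np.zeros(seq_len)   # cumulative halting probability
    n_updates = np.zeros(seq_len)      # steps actually taken per position
    out = np.zeros_like(x)             # halting-weighted sum of states
    state = x
    for _ in range(max_steps):
        still_running = halting_prob < 1.0 - eps
        if not still_running.any():
            break
        p = halt_unit(state)                     # per-position prob in (0, 1)
        new_halted = still_running & (halting_prob + p >= 1.0 - eps)
        running = still_running & ~new_halted
        # a position halting now contributes its remaining probability mass
        remainder = 1.0 - halting_prob
        weight = np.where(new_halted, remainder, np.where(running, p, 0.0))
        halting_prob = np.where(running, halting_prob + p, halting_prob)
        n_updates += still_running.astype(float)
        state = transition(state)                # one shared block, reused in depth
        out = out + weight[:, None] * state
    return out, n_updates

# Toy stand-ins (illustrative only) for the shared block and halting unit.
rng = np.random.default_rng(0)
d = 4
W = rng.standard_normal((d, d)) * 0.1
w = rng.standard_normal(d)
transition = lambda s: np.tanh(s @ W)
halt_unit = lambda s: 1.0 / (1.0 + np.exp(-(s @ w)))

x = rng.standard_normal((5, d))
out, steps = act_loop(x, transition, halt_unit)
```

Because `steps` varies per position, the depth of processing adapts to each token, which is the flexibility the proposed system exploits when sentences differ in semantic load or channel conditions differ in severity.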


Related research

02/19/2018  Deep Learning for Joint Source-Channel Coding of Text
We consider the problem of joint source and channel coding of structured...

07/23/2023  Transformer-based Joint Source Channel Coding for Textual Semantic Communication
The Space-Air-Ground-Sea integrated network calls for more robust and se...

08/01/2023  Adaptive Bitrate Video Semantic Communication over Wireless Networks
This paper investigates the adaptive bitrate (ABR) video semantic commun...

05/29/2019  SECRET: Semantically Enhanced Classification of Real-world Tasks
Supervised machine learning (ML) algorithms are aimed at maximizing clas...

08/03/2023  XNLP: An Interactive Demonstration System for Universal Structured NLP
Structured Natural Language Processing (XNLP) is an important subset of ...

05/18/2023  Rate-Adaptive Coding Mechanism for Semantic Communications With Multi-Modal Data
Recently, the ever-increasing demand for bandwidth in multi-modal commun...

10/06/2022  ByteTransformer: A High-Performance Transformer Boosted for Variable-Length Inputs
Transformer is the cornerstone model of Natural Language Processing (NLP...
