Language-Oriented Communication with Semantic Coding and Knowledge Distillation for Text-to-Image Generation

09/20/2023
by   Hyelin Nam, et al.

By integrating recent advances in large language models (LLMs) and generative models into the emerging semantic communication (SC) paradigm, in this article we put forward a novel framework of language-oriented semantic communication (LSC). In LSC, machines communicate using human-language messages that can be interpreted and manipulated via natural language processing (NLP) techniques for SC efficiency. To demonstrate LSC's potential, we introduce three innovative algorithms: 1) semantic source coding (SSC), which compresses a text prompt into its key head words, capturing the prompt's syntactic essence while maintaining their order of appearance to preserve the prompt's context; 2) semantic channel coding (SCC), which improves robustness against errors by substituting head words with their lengthier synonyms; and 3) semantic knowledge distillation (SKD), which produces listener-customized prompts via in-context learning of the listener's language style. In a communication task for progressive text-to-image generation, the proposed methods achieve higher perceptual similarity with fewer transmissions while enhancing robustness over noisy communication channels.
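The two coding steps can be illustrated with a minimal sketch. This is a hypothetical toy implementation, not the paper's method: the head-word set and synonym table below are assumed inputs that would, in practice, come from an NLP pipeline (e.g. a dependency parser and a thesaurus).

```python
# Toy sketch of SSC and SCC (hypothetical; the paper's actual
# algorithms rely on a full NLP pipeline, not hardcoded tables).

def ssc_compress(prompt: str, head_words: set[str]) -> str:
    """Semantic source coding: keep only the head words,
    preserving their original order so the prompt's context survives."""
    return " ".join(tok for tok in prompt.split() if tok in head_words)

def scc_encode(compressed_prompt: str, synonyms: dict[str, str]) -> str:
    """Semantic channel coding: swap each head word for a lengthier
    synonym, so that channel errors corrupt a smaller fraction of
    each word and are easier to correct at the receiver."""
    return " ".join(synonyms.get(tok, tok) for tok in compressed_prompt.split())

prompt = "a small red car parked near a tall old tree"
heads = {"car", "parked", "tree"}                     # assumed parser output
longer = {"car": "automobile", "tree": "evergreen"}   # assumed synonym table

compressed = ssc_compress(prompt, heads)   # -> "car parked tree"
robust = scc_encode(compressed, longer)    # -> "automobile parked evergreen"
print(compressed)
print(robust)
```

Note the order-preserving filter in `ssc_compress`: dropping non-head tokens while keeping word order is what lets the receiver's generative model still infer the scene's structure from the shortened prompt.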

Related research
03/30/2023

KD-DLGAN: Data Limited Image Generation via Knowledge Distillation

Generative Adversarial Networks (GANs) rely heavily on large-scale train...
09/08/2023

Sequential Semantic Generative Communication for Progressive Text-to-Image Generation

This paper proposes a new framework of communication system leveraging pro...
05/09/2023

SUR-adapter: Enhancing Text-to-Image Pre-trained Diffusion Models with Large Language Models

Diffusion models, which have emerged to become popular text-to-image gen...
02/28/2020

TextBrewer: An Open-Source Knowledge Distillation Toolkit for Natural Language Processing

In this paper, we introduce TextBrewer, an open-source knowledge distill...
10/27/2022

Seq2Seq-SC: End-to-End Semantic Communication Systems with Pre-trained Language Model

While semantic communication is expected to bring unprecedented communic...
05/04/2022

Knowledge Distillation of Russian Language Models with Reduction of Vocabulary

Today, transformer language models serve as a core component for majorit...
09/14/2020

A Comparison of Two Fluctuation Analyses for Natural Language Clustering Phenomena: Taylor and Ebeling Neiman Methods

This article considers the fluctuation analysis methods of Taylor and Eb...
