Adaptive Bitrate Video Semantic Communication over Wireless Networks
This paper investigates adaptive bitrate (ABR) video semantic communication over wireless networks. In the considered model, video sensing devices must transmit video semantic information to an edge server to facilitate ubiquitous video sensing services, such as road environment monitoring for autonomous driving at the edge server. However, due to varying wireless network conditions, it is challenging to guarantee both low transmission delay and high semantic accuracy if devices continuously transmit video semantic information at a fixed bitrate. To address this challenge, we develop an adaptive bitrate video semantic communication (ABRVSC) system in which devices adaptively adjust the bitrate of video semantic information according to network conditions. Specifically, we first define the quality of experience (QoE) for video semantic communication. Subsequently, a Swin Transformer-based semantic codec is proposed to extract semantic information while considering the influence of QoE. Then, we propose an Actor-Critic based ABR algorithm for the semantic codec to enhance the robustness of the proposed ABRVSC scheme against network variations. Simulation results demonstrate that, at low bitrates, the mean intersection over union (MIoU) of the proposed ABRVSC scheme is nearly twice that of the traditional scheme. Moreover, the proposed ABRVSC scheme increases the QoE in video semantic communication by 36.57% compared with both fixed bitrate schemes and traditional ABR schemes.
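To make the Actor-Critic ABR idea concrete, the sketch below shows one plausible way a bitrate controller of this kind could be structured: a policy (actor) that picks a bitrate level from recent network observations and a critic that estimates state value, together with a hedged QoE-style reward. All names, bitrate levels, state features, and QoE weights here are illustrative assumptions, not the authors' implementation or the paper's exact QoE definition.

```python
# Hypothetical sketch of an Actor-Critic bitrate controller for an ABRVSC-like setting.
# BITRATE_LEVELS_KBPS, qoe, and AbrActorCritic are assumed names for illustration only.
import torch
import torch.nn as nn
from torch.distributions import Categorical

BITRATE_LEVELS_KBPS = [300, 750, 1200, 1850, 2850]  # assumed candidate bitrates

def qoe(miou, delay_s, bitrate_switch_kbps, w1=1.0, w2=0.5, w3=0.1):
    """Assumed QoE form: reward semantic accuracy (MIoU), penalize delay and
    bitrate switching; the paper's exact terms and weights may differ."""
    return w1 * miou - w2 * delay_s - w3 * abs(bitrate_switch_kbps) / 1000.0

class AbrActorCritic(nn.Module):
    """Shared trunk with an actor head (distribution over bitrate levels) and a critic head."""
    def __init__(self, state_dim=6, hidden=128, n_actions=len(BITRATE_LEVELS_KBPS)):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(state_dim, hidden), nn.ReLU())
        self.actor = nn.Linear(hidden, n_actions)   # logits over bitrate levels
        self.critic = nn.Linear(hidden, 1)          # state-value estimate

    def forward(self, state):
        h = self.trunk(state)
        return Categorical(logits=self.actor(h)), self.critic(h)

# Example decision step: the state could hold recent throughput samples (Mbps),
# measured delay, buffer occupancy, and the previously chosen bitrate (normalized).
if __name__ == "__main__":
    model = AbrActorCritic()
    state = torch.tensor([[2.1, 1.8, 2.3, 0.12, 0.3, 0.42]])  # illustrative, normalized features
    dist, value = model(state)
    action = dist.sample()
    print("chosen bitrate:", BITRATE_LEVELS_KBPS[action.item()], "kbps",
          "| value estimate:", value.item())
```

In a training loop, the QoE-style reward would drive standard Actor-Critic updates (advantage-weighted policy gradient for the actor, value regression for the critic); the sketch only shows the per-step bitrate decision.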