Finite-Blocklength Performance of Sequential Transmission over BSC with Noiseless Feedback

02/02/2019
by Hengjie Yang, et al.

In this paper, we consider the expected blocklength of variable-length coding over the binary symmetric channel (BSC) with noiseless feedback. Horstein first proposed a simple one-phase scheme that achieves the capacity of the BSC. Naghshvar et al. introduced the extrinsic Jensen-Shannon (EJS) divergence, used it in a sequential transmission scheme that maximizes the EJS divergence (MaxEJS), and provided a non-asymptotic upper bound on the expected blocklength of MaxEJS. Simulations in this paper show that MaxEJS achieves lower expected blocklengths than the original Horstein scheme, but the non-asymptotic bound of Naghshvar et al. is so loose that it lies above the simulated performance of the Horstein scheme for a BSC with a small crossover probability. This paper proposes a new expression for the MaxEJS expected blocklength that tightly approximates the simulated performance. The expression is developed by studying a genie-aided decoder (GAD), whose expected blocklength is always larger than that of MaxEJS and can be approximated by two random walks. We conjecture that, even with these two approximations, the expression remains an upper bound on the expected blocklength, as suggested by the simulation results.
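To make the random-walk approximation idea concrete, the sketch below estimates the mean hitting time of a biased random walk by Monte Carlo and compares it to the first-order approximation given by Wald's identity. This is a generic illustration, not the paper's actual analysis: the step size d, the target threshold, and the drift formula are assumptions chosen to mimic the log-likelihood accumulation of a decoder observing a BSC with crossover probability p.

```python
import math
import random

def hitting_time_mc(p, target, trials=2000, seed=0):
    """Monte Carlo estimate of the mean number of steps for a biased
    random walk to first reach `target`.

    The walk moves +d with probability 1-p and -d with probability p,
    where d = log2((1-p)/p) is the per-step log-likelihood increment
    when a bit sent over a BSC(p) arrives correctly vs. flipped.
    (Illustrative model only; not the two walks defined in the paper.)
    """
    d = math.log2((1 - p) / p)
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        s, n = 0.0, 0
        while s < target:
            s += d if rng.random() < 1 - p else -d
            n += 1
        total += n
    return total / trials

p, target = 0.05, 50.0
d = math.log2((1 - p) / p)
drift = (1 - 2 * p) * d            # mean step of the walk
mc = hitting_time_mc(p, target)
wald = target / drift              # Wald's identity: E[N] ~ target / drift
print(f"simulated mean hitting time: {mc:.1f}")
print(f"Wald approximation:          {wald:.1f}")
```

The simulated mean slightly exceeds the Wald estimate because the walk overshoots the threshold; bounding that overshoot is exactly the kind of refinement needed to turn a hitting-time approximation into a provable upper bound on expected blocklength.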
