Distributed Arithmetic Coding for Sources with Hidden Markov Correlation

01/07/2021
by Yong Fang, et al.

Distributed arithmetic coding (DAC) has been shown to be effective for Slepian-Wolf coding, especially for short data blocks. In this letter, we propose to use DAC to compress memory-correlated sources. More specifically, the correlation between the sources is modeled as a hidden Markov process. Experimental results show that the performance of the proposed scheme is close to the theoretical Slepian-Wolf limit.
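
To make the setup concrete, the Python sketch below simulates the kind of memory correlation the abstract describes, assuming a hypothetical two-state (Gilbert-Elliott style) hidden Markov model for the bit-flip process Z, with Y = X XOR Z. The parameters P_STAY and P_FLIP and the helper functions are illustrative assumptions, not taken from the letter; the sketch generates correlated source pairs and crudely estimates the corresponding Slepian-Wolf limit H(X|Y), which here equals the entropy rate of Z, from samples.

import numpy as np
from collections import Counter

# Hypothetical two-state hidden Markov correlation model (illustrative parameters,
# not from the letter): the noise Z_i = X_i XOR Y_i flips with a probability that
# depends on a hidden state alternating between a "good" and a "bad" regime.
P_STAY = [0.95, 0.90]   # probability of remaining in hidden state 0 / state 1
P_FLIP = [0.01, 0.20]   # crossover probability of Z in each hidden state

def correlated_pair(n, rng):
    """Return (x, y): x is i.i.d. uniform bits, y = x XOR z with hidden-Markov noise z."""
    x = rng.integers(0, 2, size=n)
    z = np.zeros(n, dtype=np.int64)
    state = 0
    for i in range(n):
        z[i] = int(rng.random() < P_FLIP[state])
        if rng.random() > P_STAY[state]:   # leave the current hidden state
            state = 1 - state
    return x, x ^ z

def cond_entropy_estimate(z, k=4):
    """Crude order-k empirical estimate of H(Z_i | Z_{i-k}, ..., Z_{i-1}),
    which upper-bounds the entropy rate of Z, i.e. the Slepian-Wolf limit H(X|Y)."""
    ctx, joint = Counter(), Counter()
    for i in range(k, len(z)):
        c = tuple(z[i - k:i])
        ctx[c] += 1
        joint[c + (z[i],)] += 1
    n = sum(ctx.values())
    h = 0.0
    for key, cnt in joint.items():
        h -= (cnt / n) * np.log2(cnt / ctx[key[:-1]])
    return h

rng = np.random.default_rng(0)
x, y = correlated_pair(200_000, rng)
z = x ^ y
print("average crossover rate:", z.mean())
print("order-4 estimate of H(X|Y):", cond_entropy_estimate(z))

Because Z has memory, the estimated H(X|Y) falls below the binary entropy of the average crossover rate, which is exactly the gap a memory-aware Slepian-Wolf code can exploit.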

Related research

01/19/2021  Hidden Markov Model-Based Encoding for Time-Correlated IoT Sources
As the use of Internet of Things (IoT) devices for monitoring purposes b...

11/02/2011  Distributed Lossy Source Coding Using Real-Number Codes
We show how real-number codes can be used to compress correlated sources...

07/02/2017  Speaker Identification in a Shouted Talking Environment Based on Novel Third-Order Circular Suprasegmental Hidden Markov Models
It is well known that speaker identification yields very high performanc...

10/05/2022  Extending Conformal Prediction to Hidden Markov Models with Exact Validity via de Finetti's Theorem for Markov Chains
Conformal prediction is a widely used method to quantify uncertainty in ...

04/07/2020  Arithmetic, Geometry, and Coding Theory: Homage to Gilles Lachaud
We give an overview of several of the mathematical works of Gilles Lacha...

10/03/2018  Lattice-based Robust Distributed Source Coding
In this paper, we propose a lattice-based robust distributed source codi...

03/06/2023  Improving the Runtime of Algorithmic Polarization of Hidden Markov Models
We improve the runtime of the linear compression scheme for hidden Marko...
