On Computing the Total Variation Distance of Hidden Markov Models

04/17/2018
by Stefan Kiefer, et al.

We prove results on the decidability and complexity of computing the total variation distance (equivalently, the L_1 distance) of hidden Markov models (equivalently, labelled Markov chains). This distance measures the difference between the distributions on words that two hidden Markov models induce. The main results are: (1) it is undecidable whether the distance is greater than a given threshold; (2) approximating the distance is #P-hard and in PSPACE.
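To make the object of study concrete, here is a minimal sketch (not from the paper; representation and chain definitions are illustrative assumptions) of the quantity involved: for two labelled Markov chains, half the L_1 distance between their length-n word distributions is a lower bound on the total variation distance, and these bounds are non-decreasing in n and converge to it.

```python
# A labelled Markov chain (LMC) is given as a pair (init, trans):
#   init     : dict mapping state -> initial probability
#   trans[s] : list of (probability, label, next_state) triples
# This representation is a hypothetical choice for illustration.

def word_probs(init, trans, n):
    """Distribution over the label words of length n emitted by the LMC."""
    # dist maps (word, current_state) -> probability mass
    dist = {((), s): p for s, p in init.items() if p > 0}
    for _ in range(n):
        nxt = {}
        for (w, s), p in dist.items():
            for q, a, t in trans[s]:
                key = (w + (a,), t)
                nxt[key] = nxt.get(key, 0.0) + p * q
        dist = nxt
    # marginalise out the state, keeping only the emitted word
    words = {}
    for (w, _s), p in dist.items():
        words[w] = words.get(w, 0.0) + p
    return words

def tv_lower_bound(lmc1, lmc2, n):
    """(1/2) * L_1 distance of the two length-n word distributions."""
    p1 = word_probs(*lmc1, n)
    p2 = word_probs(*lmc2, n)
    support = set(p1) | set(p2)
    return 0.5 * sum(abs(p1.get(w, 0.0) - p2.get(w, 0.0)) for w in support)

# Two toy chains over the labels {a, b}:
# chain A emits 'a' first, then 'a'/'b' uniformly forever;
# chain B emits 'a'/'b' uniformly from the start.
A = ({0: 1.0}, {0: [(1.0, 'a', 1)], 1: [(0.5, 'a', 1), (0.5, 'b', 1)]})
B = ({0: 1.0}, {0: [(0.5, 'a', 0), (0.5, 'b', 0)]})

for n in (1, 2, 3):
    print(n, tv_lower_bound(A, B, n))  # 0.5 at every n for this pair
```

For these two toy chains the bound is already tight at n = 1 (the event "the first label is b" has probability 0 under A and 0.5 under B), but in general no finite n suffices, which is consistent with the hardness results stated in the abstract.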


Related research

09/27/2020  Equivalence of Hidden Markov Models with Continuous Observations
We consider Hidden Markov Models that emit sequences of observations tha...

02/28/2023  Learning Hidden Markov Models Using Conditional Samples
This paper is concerned with the computational complexity of learning th...

02/05/2019  Exploiting locality in high-dimensional factorial hidden Markov models
We propose algorithms for approximate filtering and smoothing in high-di...

07/15/2020  The Big-O Problem for Labelled Markov Chains and Weighted Automata
Given two weighted automata, we consider the problem of whether one is b...

06/16/2022  Bias and Excess Variance in Election Polling: A Not-So-Hidden Markov Model
With historic misses in the 2016 and 2020 US Presidential elections, int...

08/31/2020  Optimal Bayesian Quickest Detection for Hidden Markov Models and Structured Generalisations
In this paper we consider the problem of quickly detecting changes in hi...

04/04/2022  Reliable Editions from Unreliable Components: Estimating Ebooks from Print Editions Using Profile Hidden Markov Models
A profile hidden Markov model, a popular model in biological sequence an...
