
An efficient algorithm for estimating state sequences in imprecise hidden Markov models
We present an efficient exact algorithm for estimating state sequences from outputs (or observations) in imprecise hidden Markov models (iHMMs), where both the uncertainty linking one state to the next, and that linking a state to its output, are represented using coherent lower previsions. The notion of independence we associate with the credal network representing the iHMM is that of epistemic irrelevance. We consider as best estimates for state sequences the (Walley-Sen) maximal sequences for the posterior joint state model conditioned on the observed output sequence, associated with a gain function that is the indicator of the state sequence. This corresponds to (and generalises) finding the state sequence with the highest posterior probability in HMMs with precise transition and output probabilities (pHMMs). We argue that the computational complexity is at worst quadratic in the length of the Markov chain, cubic in the number of states, and essentially linear in the number of maximal state sequences. For binary iHMMs, we investigate experimentally how the number of maximal state sequences depends on the model parameters. We also present a simple toy application in optical character recognition, demonstrating that our algorithm can be used to robustify the inferences made by precise probability models.
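In the precise-probability special case (pHMMs) that the abstract says is generalised, the single state sequence of highest posterior probability is found by the classical Viterbi algorithm. The sketch below is not the paper's imprecise algorithm; it is a minimal NumPy implementation of that precise baseline, with hypothetical parameter names (`pi`, `A`, `B`, `obs`), to make the object being generalised concrete:

```python
import numpy as np

def viterbi(pi, A, B, obs):
    """Most probable hidden state sequence in a precise HMM (pHMM).

    pi  : (S,) initial state probabilities
    A   : (S, S) transition probabilities, A[i, j] = P(next=j | current=i)
    B   : (S, O) emission probabilities,  B[i, k] = P(output=k | state=i)
    obs : list of observed output indices of length T
    """
    S, T = len(pi), len(obs)
    with np.errstate(divide="ignore"):       # tolerate log(0) = -inf
        log_pi, log_A, log_B = np.log(pi), np.log(A), np.log(B)

    # delta[t, j]: log-probability of the best path ending in state j at time t
    delta = np.empty((T, S))
    back = np.zeros((T, S), dtype=int)       # best predecessor of each state
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A   # scores[i, j]: via state i
        back[t] = np.argmax(scores, axis=0)
        delta[t] = scores[back[t], range(S)] + log_B[:, obs[t]]

    # Backtrack the single maximising sequence.
    path = [int(np.argmax(delta[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]
```

The imprecise setting replaces each of these point probabilities with coherent lower previsions and returns the whole set of (Walley-Sen) maximal sequences rather than a single maximiser, which is why the paper's complexity bound carries a factor linear in the number of maximal sequences.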