
Information-Theoretic Reduction of Markov Chains

by Bernhard C. Geiger, et al.

We survey information-theoretic approaches to the reduction of Markov chains. Our survey is structured in two parts. The first part considers Markov chain coarse graining, which focuses on projecting the Markov chain to a process on a smaller state space that is informative about certain quantities of interest. The second part considers Markov chain model reduction, which focuses on replacing the original Markov model by a simplified one that yields behavior similar to that of the original. We discuss the practical relevance of both approaches in the field of knowledge discovery and data mining by formulating problems of unsupervised machine learning as reduction problems of Markov chains. Finally, we briefly discuss the concept of lumpability, the phenomenon in which a coarse graining yields a reduced Markov model.
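As a concrete illustration of the lumpability phenomenon mentioned above (this example is not taken from the survey itself), the following NumPy sketch coarse-grains a small Markov chain by aggregating states into blocks and checks the strong lumpability condition: states within the same block must have identical transition probabilities into every block. The transition matrix and partition here are illustrative assumptions.

```python
import numpy as np

# Transition matrix of a 4-state Markov chain (rows sum to 1); chosen so
# that the partition {0,1} / {2,3} is strongly lumpable.
P = np.array([
    [0.5, 0.3, 0.1, 0.1],
    [0.3, 0.5, 0.1, 0.1],
    [0.1, 0.1, 0.4, 0.4],
    [0.1, 0.1, 0.4, 0.4],
])

# Coarse graining: partition the 4 states into 2 blocks.
partition = [[0, 1], [2, 3]]

def block_sums(P, partition):
    """For each original state, the probability of jumping into each block."""
    return np.stack([P[:, blk].sum(axis=1) for blk in partition], axis=1)

def is_lumpable(P, partition, tol=1e-12):
    """Strong lumpability: all states in a block share the same
    block-transition probabilities."""
    S = block_sums(P, partition)
    return all(np.allclose(S[blk], S[blk[0]], atol=tol) for blk in partition)

def lumped_chain(P, partition):
    """Transition matrix of the coarse-grained chain (valid when lumpable)."""
    S = block_sums(P, partition)
    return np.stack([S[blk[0]] for blk in partition], axis=0)

print(is_lumpable(P, partition))   # True
print(lumped_chain(P, partition))  # [[0.8 0.2]
                                   #  [0.2 0.8]]
```

When the condition fails, the projected process is no longer Markov, which is exactly what separates coarse graining in general from the special lumpable case discussed in the survey.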




Related research:

- Path-entropy maximized Markov chains for dimensionality reduction
- Mode Reduction for Markov Jump Systems
- Semi-Supervised Clustering via Markov Chain Aggregation
- Learning from non-irreducible Markov chains
- A supCBI process with application to streamflow discharge and a model reduction
- Reduction of Markov Chains using a Value-of-Information-Based Approach