On Prediction Using Variable Order Markov Models

06/30/2011
by R. Begleiter, et al.

This paper is concerned with algorithms for prediction of discrete sequences over a finite alphabet, using variable order Markov models. The class of such algorithms is large and in principle includes any lossless compression algorithm. We focus on six prominent prediction algorithms, including Context Tree Weighting (CTW), Prediction by Partial Match (PPM) and Probabilistic Suffix Trees (PSTs). We discuss the properties of these algorithms and compare their performance using real life sequences from three domains: proteins, English text and music pieces. The comparison is made with respect to prediction quality as measured by the average log-loss. We also compare classification algorithms based on these predictors with respect to a number of large protein classification tasks. Our results indicate that a "decomposed" CTW (a variant of the CTW algorithm) and PPM outperform all other algorithms in sequence prediction tasks. Somewhat surprisingly, a different algorithm, which is a modification of the Lempel-Ziv compression algorithm, significantly outperforms all algorithms on the protein classification problems.
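To make the evaluation criterion concrete: the average log-loss of a predictor over a sequence x_1..x_T is -(1/T) * sum_i log2 P(x_i | x_1..x_{i-1}), i.e. the average number of bits per symbol the predictor would need to encode the sequence. Below is a minimal sketch of this evaluation protocol. The `average_log_loss` helper and the toy Laplace-smoothed fixed-order Markov predictor are illustrative assumptions only; they are not one of the six variable order Markov algorithms compared in the paper.

```python
import math
from collections import defaultdict

def average_log_loss(prob_fn, sequence):
    """Average log-loss in bits per symbol: -(1/T) * sum_i log2 P(x_i | x_1..x_{i-1})."""
    total = 0.0
    for i, symbol in enumerate(sequence):
        p = prob_fn(sequence[:i], symbol)  # conditional probability of the next symbol
        total -= math.log2(p)
    return total / len(sequence)

class LaplaceMarkov:
    """Toy fixed-order Markov predictor with add-one smoothing (illustration only)."""
    def __init__(self, order, alphabet):
        self.order = order
        self.alphabet = list(alphabet)
        self.counts = defaultdict(lambda: defaultdict(int))

    def train(self, sequence):
        # Count symbol occurrences following each length-`order` context.
        for i in range(self.order, len(sequence)):
            context = tuple(sequence[i - self.order:i])
            self.counts[context][sequence[i]] += 1

    def prob(self, history, symbol):
        # Uniform fallback when the context is too short or was never observed.
        if len(history) < self.order:
            return 1.0 / len(self.alphabet)
        context = tuple(history[-self.order:])
        if context not in self.counts:
            return 1.0 / len(self.alphabet)
        ctx_counts = self.counts[context]
        total = sum(ctx_counts.values())
        return (ctx_counts[symbol] + 1) / (total + len(self.alphabet))

# Usage: train on a sequence, then report bits per symbol on a test sequence.
model = LaplaceMarkov(order=2, alphabet="abcdr")
model.train("abracadabra")
print(average_log_loss(model.prob, "abracadabra"))
```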

