Hidden Markov Chains, Entropic Forward-Backward, and Part-Of-Speech Tagging

by Elie Azeraf, et al.

The ability to take into account the characteristics, also called features, of observations is essential in Natural Language Processing (NLP) problems. The Hidden Markov Chain (HMC) model, associated with the classic Forward-Backward probabilities, cannot handle arbitrary features such as prefixes or suffixes of any size, except under an independence assumption. For twenty years, this drawback has encouraged the development of other sequential models, starting with the Maximum Entropy Markov Model (MEMM), which elegantly integrates arbitrary features. More generally, it has led to the neglect of HMC in NLP. In this paper, we show that the problem is not due to HMC itself, but to the way its restoration algorithms are computed. We present a new way of computing HMC-based restorations using original Entropic Forward and Entropic Backward (EFB) probabilities. Our method allows features to be taken into account in the HMC framework in the same way as in the MEMM framework. We illustrate the efficiency of HMC using EFB in Part-Of-Speech tagging, showing its superiority over MEMM-based restoration. As a perspective, we also outline how HMCs with EFB might serve as an alternative to Recurrent Neural Networks for treating sequential data with a deep architecture.
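To make the baseline concrete: the classic Forward-Backward restoration the abstract refers to computes, for each position, the posterior probability of each hidden tag given the whole observed sentence, and restores the most probable tag pointwise. The sketch below is a minimal NumPy implementation of this classical procedure with toy numbers; it is not the paper's Entropic Forward-Backward, whose recursions are defined in the paper itself.

```python
import numpy as np

def forward_backward(pi, A, B, obs):
    """Posterior p(state_t | obs_1..T) for each position t.

    pi  : (K,)   initial state distribution
    A   : (K, K) transition matrix, A[i, j] = p(state j | state i)
    B   : (K, V) emission matrix,   B[i, o] = p(obs o | state i)
    obs : list of observation indices
    """
    T, K = len(obs), len(pi)
    alpha = np.zeros((T, K))  # forward probabilities
    beta = np.zeros((T, K))   # backward probabilities

    # Forward pass: alpha[t, i] = p(obs_1..t, state_t = i)
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]

    # Backward pass: beta[t, i] = p(obs_{t+1}..T | state_t = i)
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

    # Pointwise posterior, normalized per position
    post = alpha * beta
    return post / post.sum(axis=1, keepdims=True)

# Toy example: 2 tags, vocabulary of 3 words (all values illustrative).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],
              [0.1, 0.3, 0.6]])
posteriors = forward_backward(pi, A, B, [0, 1, 2])
tags = posteriors.argmax(axis=1)  # pointwise most-probable tag per position
```

Note that the emissions enter only through `B[:, obs[t]]`, i.e., through p(observation | tag); this is exactly where arbitrary, overlapping features (prefixes, suffixes, capitalization) cannot be injected without an independence assumption, which is the limitation the paper's EFB computation removes.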




