An Entropy-based Learning Algorithm of Bayesian Conditional Trees

03/13/2013
by   Dan Geiger, et al.

This article offers a modification of Chow and Liu's learning algorithm in the context of handwritten digit recognition. The modified algorithm directs the user to group digits into several classes, each consisting of digits that are hard to distinguish, and then to construct an optimal conditional-tree representation for each class of digits rather than for each single digit, as done by Chow and Liu (1968). Advantages and extensions of the new method are discussed. Related works by Wong and Wang (1977) and Wong and Poon (1989), which offer a different entropy-based learning algorithm, are shown to rest on inappropriate assumptions.
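The underlying Chow–Liu procedure estimates the pairwise mutual information between variables and then takes a maximum-weight spanning tree over those weights; the modified algorithm applies this per class of hard-to-distinguish digits rather than per digit. A minimal sketch of the Chow–Liu step, assuming binary features stored as rows of tuples (the data layout and function names are illustrative, not the paper's formulation):

```python
import math
from itertools import combinations

def mutual_information(data, i, j):
    """Empirical mutual information (in nats) between columns i and j of data."""
    n = len(data)
    joint, pi, pj = {}, {}, {}
    for row in data:
        a, b = row[i], row[j]
        joint[(a, b)] = joint.get((a, b), 0) + 1
        pi[a] = pi.get(a, 0) + 1
        pj[b] = pj.get(b, 0) + 1
    mi = 0.0
    for (a, b), c in joint.items():
        # p(a,b) * log( p(a,b) / (p(a) p(b)) ), computed from raw counts
        mi += (c / n) * math.log(c * n / (pi[a] * pj[b]))
    return mi

def chow_liu_tree(data, num_vars):
    """Maximum-weight spanning tree over variables, weighted by mutual
    information, built with Kruskal's algorithm and union-find."""
    edges = sorted(
        ((mutual_information(data, i, j), i, j)
         for i, j in combinations(range(num_vars), 2)),
        reverse=True,
    )
    parent = list(range(num_vars))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    tree = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:           # adding this edge creates no cycle
            parent[ri] = rj
            tree.append((i, j, w))
    return tree
```

In the per-class variant described in the abstract, `chow_liu_tree` would be run once on the pooled samples of each class of confusable digits, yielding one conditional tree per class.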


